# Chapter 4: Bivariate Distributions


## 4.1 Distributions of Two Random Variables

In many practical situations it is desirable to take more than one measurement of a random observation. Brief examples:

1. What is the relationship between the high school rank $x$ and the ACT score $y$ of incoming college students? How can these measurements be used to predict the first-year college GPA $z$ with a function $z = v(x, y)$?
2. What is the relationship between the running velocity $x$, running time $y$, and heart rate $z$ of a runner?

We begin with bivariate distributions for the discrete case. The continuous case is essentially the same, but with integrals replacing summations.

**Def.** Let $X$ and $Y$ be two random variables defined on a discrete space. Let $S$ denote the two-dimensional space of $X$ and $Y$. The function

$$f(x, y) := P(X = x, Y = y)$$

is called the **joint probability mass function** (joint p.m.f.) of $X$ and $Y$ and has the following properties:

1. $0 \le f(x, y) \le 1$.
2. $\sum_{(x,y) \in S} f(x, y) = 1$.
3. $P[(X, Y) \in A] = \sum_{(x,y) \in A} f(x, y)$, where $A \subseteq S$.
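The three defining properties can be checked numerically. A minimal sketch in Python, assuming a small hypothetical joint p.m.f. table (the probabilities below are made up for illustration):

```python
# Hypothetical joint p.m.f. of (X, Y), stored as a dict: f[(x, y)] = P(X = x, Y = y).
f = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.30, (2, 2): 0.40,
}

# Property 2: the probabilities over the whole space S sum to 1.
total = sum(f.values())

# Property 3: P[(X, Y) in A] is the sum of f(x, y) over the points of A.
A = {(1, 2), (2, 1)}
p_A = sum(p for point, p in f.items() if point in A)

print(total)  # ~1.0 (up to floating-point rounding)
print(p_A)    # P[(X, Y) in A]
```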

Ex 1, p.180 (scanned file)

**Def.** Let $X$ and $Y$ have the joint p.m.f. $f(x, y)$ with space $S$.

1. The **marginal p.m.f. of $X$** is defined by
   $$f_1(x) = P(X = x) = \sum_y f(x, y), \quad x \in S_1 \text{ (the $x$ space)},$$
   where the summation is taken over all possible $y$ values for the given $x$.
2. The **marginal p.m.f. of $Y$** is defined by
   $$f_2(y) = P(Y = y) = \sum_x f(x, y), \quad y \in S_2 \text{ (the $y$ space)},$$
   where the summation is taken over all possible $x$ values for the given $y$.
3. The r.v.s $X$ and $Y$ are **independent** iff for all $(x, y) \in S$,
   $$P(X = x, Y = y) = P(X = x)\, P(Y = y),$$
   or equivalently, $f(x, y) = f_1(x) f_2(y)$ for all $(x, y) \in S$. Otherwise, $X$ and $Y$ are said to be **dependent**.

Ex 2-4, p. (scanned file)

The histogram of a joint p.m.f. $f(x, y)$ is sketched by drawing rectangular columns with bases on $S$: each column is centered at some $(x, y) \in S$, with a $1 \times 1$ unit square base and height $f(x, y)$.

Ex 5, p. (scanned file)

Sometimes it is convenient to replace $X$ and $Y$ by $X_1$ and $X_2$. Let r.v.s $X_1$ and $X_2$ have the joint p.m.f. $f(x_1, x_2)$ defined on the space $S$. Let $u(X_1, X_2)$ be a function of these two r.v.s. The **mathematical expectation** or **expected value** of $u(X_1, X_2)$ is
$$E[u(X_1, X_2)] = \sum_{(x_1, x_2) \in S} u(x_1, x_2)\, f(x_1, x_2).$$
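Marginals and the independence criterion translate directly into a sum over the joint table. A sketch, assuming a hypothetical joint p.m.f. chosen so that $f(x, y)$ actually factors as $f_1(x) f_2(y)$:

```python
# Hypothetical joint p.m.f.; the values are picked so that X and Y are independent.
f = {
    (1, 1): 0.12, (1, 2): 0.18,
    (2, 1): 0.28, (2, 2): 0.42,
}

# Marginal p.m.f.s: f1(x) = sum over y of f(x, y); f2(y) = sum over x of f(x, y).
f1, f2 = {}, {}
for (x, y), p in f.items():
    f1[x] = f1.get(x, 0.0) + p
    f2[y] = f2.get(y, 0.0) + p

# Independence: f(x, y) = f1(x) * f2(y) must hold at every point of S.
independent = all(abs(p - f1[x] * f2[y]) < 1e-12 for (x, y), p in f.items())
print(f1, f2, independent)
```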

**Ex.**

1. Set $u(X_1, X_2) = X_i$ for a fixed $i = 1, 2$. The mean/expected value of $X_i$ is
   $$E[u(X_1, X_2)] = E(X_i) = \mu_i.$$
2. Set $v(X_1, X_2) = (X_i - \mu_i)^2$ for a fixed $i = 1, 2$. The variance of $X_i$ is
   $$E[v(X_1, X_2)] = E[(X_i - \mu_i)^2] = \sigma_i^2 = \operatorname{Var}(X_i).$$

Ex 6, p.184 (scanned file)

For two random variables of continuous type, we simply replace the joint p.m.f. by the **joint probability density function** (joint p.d.f.), and replace summations by integrals. Therefore the joint p.d.f. $f(x, y)$ of two continuous-type r.v.s $X$ and $Y$ satisfies the following properties:

1. $f(x, y) \ge 0$, and $f(x, y) = 0$ iff $(x, y)$ is not in the space $S$ of $X$ and $Y$.
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$.
3. $P[(X, Y) \in A] = \iint_A f(x, y)\, dx\, dy$, where $\{(X, Y) \in A\}$ is an event $A$ defined in the plane.

Ex 7-8, p. (scanned file)

Two r.v.s $X$ and $Y$ of continuous type are **independent** iff $f(x, y) = f_1(x) f_2(y)$ for all $(x, y) \in S$, where $f_1(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$ is the marginal p.d.f. of $X$, and $f_2(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$ is the marginal p.d.f. of $Y$.

Return to two r.v.s of discrete type. The next example is an extension of the hypergeometric distribution:

Ex 9, p.187 (scanned file)
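In the discrete case, $E[u(X_1, X_2)]$ is a single sum over $S$, which makes the mean and variance of $X_i$ one-liners. A sketch with an assumed joint p.m.f. (the table values are illustrative only):

```python
# Assumed joint p.m.f. of (X1, X2) for illustration.
f = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.3, (1, 1): 0.4,
}

def expect(u, f):
    """E[u(X1, X2)] = sum over (x1, x2) in S of u(x1, x2) * f(x1, x2)."""
    return sum(u(x1, x2) * p for (x1, x2), p in f.items())

# Mean and variance of X1 as special cases of E[u(X1, X2)].
mu1 = expect(lambda x1, x2: x1, f)
var1 = expect(lambda x1, x2: (x1 - mu1) ** 2, f)
print(mu1, var1)
```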

We now extend the binomial distribution to the **trinomial distribution**. Suppose that each trial has three outcomes: perfect (with probability $p_1$), seconds (with probability $p_2$), and defective (with probability $p_3 = 1 - p_1 - p_2$). We repeat the trial $n$ independent times. Let $X_i$ ($i = 1, 2, 3$) denote the number of each outcome in the $n$ trials. The probability that we get $x_1$ perfects, $x_2$ seconds, and $n - x_1 - x_2$ defectives is the trinomial p.m.f.
$$f(x_1, x_2) = P(X_1 = x_1, X_2 = x_2) = \binom{n}{x_1,\, x_2,\, n - x_1 - x_2} p_1^{x_1} p_2^{x_2} (1 - p_1 - p_2)^{n - x_1 - x_2} = \frac{n!}{x_1!\, x_2!\, (n - x_1 - x_2)!}\, p_1^{x_1} p_2^{x_2} (1 - p_1 - p_2)^{n - x_1 - x_2}.$$

Clearly $X_1$ and $X_2$ are dependent.

Ex 10-11, p. (scanned file)

Homework 4.1: 1, 3, 5, 7, 9, 11

Attachment: Scanned textbook pages of Section 4-1
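The trinomial p.m.f. can be evaluated directly from the factorial formula. A sketch, with $n$, $p_1$, $p_2$ chosen as assumed example values; the sanity check confirms the p.m.f. sums to 1 over all admissible $(x_1, x_2)$:

```python
from math import factorial

def trinomial_pmf(x1, x2, n, p1, p2):
    """f(x1, x2) = n!/(x1! x2! x3!) * p1^x1 * p2^x2 * p3^x3, with x3 = n - x1 - x2."""
    x3 = n - x1 - x2
    if x1 < 0 or x2 < 0 or x3 < 0:
        return 0.0
    coeff = factorial(n) // (factorial(x1) * factorial(x2) * factorial(x3))
    return coeff * p1**x1 * p2**x2 * (1 - p1 - p2)**x3

n, p1, p2 = 5, 0.5, 0.3  # assumed example parameters

# Sanity check: total probability over all (x1, x2) with x1 + x2 <= n is 1.
total = sum(trinomial_pmf(x1, x2, n, p1, p2)
            for x1 in range(n + 1) for x2 in range(n + 1 - x1))
print(trinomial_pmf(2, 2, n, p1, p2), total)
```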

## 4.2 The Correlation Coefficient

Let $X_1, X_2$ be two r.v.s. The mathematical expectation of a function of two r.v.s, say $u(X_1, X_2)$, has been defined. In particular, the mean and variance of $X_i$ are
$$\mu_i = E(X_i), \qquad \sigma_i^2 = E[(X_i - \mu_i)^2].$$

Now we study the mathematical expectations of some other special functions:

1. Set $u(X_1, X_2) = (X_1 - \mu_1)(X_2 - \mu_2)$. The **covariance** of $X_1$ and $X_2$ is
   $$\sigma_{12} := \operatorname{Cov}(X_1, X_2) := E[u(X_1, X_2)] = E[(X_1 - \mu_1)(X_2 - \mu_2)].$$
2. The **correlation coefficient** of $X_1$ and $X_2$ is
   $$\rho = \frac{\operatorname{Cov}(X_1, X_2)}{\sigma_1 \sigma_2} = \frac{\sigma_{12}}{\sigma_1 \sigma_2}.$$

Ex 1, p.191 (scanned file)

**Thm 4.1** (Properties of the correlation coefficient $\rho$)

1. $|\rho| \le 1$. Roughly speaking, $\rho = 1$ if $X_1 - \mu_1$ and $X_2 - \mu_2$ go proportionally in the same direction; $\rho = -1$ if $X_1 - \mu_1$ and $X_2 - \mu_2$ go proportionally in opposite directions.
2. $E(X_1 X_2) = \mu_1 \mu_2 + \rho \sigma_1 \sigma_2$.
3. The **least squares regression line** of $X_1$ and $X_2$ is the line through $(\mu_1, \mu_2)$ with slope $\rho \frac{\sigma_2}{\sigma_1}$, that is,
   $$\frac{x_2 - \mu_2}{\sigma_2} = \rho\, \frac{x_1 - \mu_1}{\sigma_1}.$$
   It is the best-fit line of the bivariate distribution: the line $x_2 = a + b x_1$ such that $E[(X_2 - a - b X_1)^2]$ is minimal.

Ex 2-3, p. (scanned file)

Homework 4.2: 1, 3, 7, 11

Attachment: Scanned textbook pages of Section 4-2
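Covariance, $\rho$, and the least squares regression line are all expectations against the joint p.m.f., so they can be computed with the same summation helper. A sketch over an assumed joint p.m.f. (table values made up for illustration):

```python
from math import sqrt

# Assumed joint p.m.f. of (X1, X2); mass concentrated on the diagonal gives rho > 0.
f = {
    (1, 1): 0.4, (1, 2): 0.1,
    (2, 1): 0.1, (2, 2): 0.4,
}

def expect(u):
    return sum(u(x1, x2) * p for (x1, x2), p in f.items())

mu1 = expect(lambda x1, x2: x1)
mu2 = expect(lambda x1, x2: x2)
var1 = expect(lambda x1, x2: (x1 - mu1) ** 2)
var2 = expect(lambda x1, x2: (x2 - mu2) ** 2)

# Covariance and correlation coefficient.
cov = expect(lambda x1, x2: (x1 - mu1) * (x2 - mu2))
rho = cov / (sqrt(var1) * sqrt(var2))

# Least squares regression line x2 = a + b*x1: slope b = rho * sigma2 / sigma1,
# and the line passes through (mu1, mu2), so a = mu2 - b * mu1.
b = rho * sqrt(var2) / sqrt(var1)
a = mu2 - b * mu1
print(cov, rho, a, b)
```

As a cross-check, Thm 4.1(2) holds for this table: $E(X_1 X_2)$ computed directly equals $\mu_1 \mu_2 + \rho \sigma_1 \sigma_2$.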

## 4.3 Conditional Distributions

Let $X$ and $Y$ have a joint discrete distribution with joint p.m.f. $f(x, y)$ on space $S$, marginal p.m.f. $f_1(x)$ on space $S_1$, and marginal p.m.f. $f_2(y)$ on space $S_2$. The conditional probability of the event $A = \{X = x\}$ given $B = \{Y = y\}$ is
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{f(x, y)}{f_2(y)}.$$

**Def.** The **conditional p.m.f. of $X$, given that $Y = y$**, is
$$g(x \mid y) = \frac{f(x, y)}{f_2(y)}, \quad \text{provided that } f_2(y) > 0.$$
The **conditional p.m.f. of $Y$, given that $X = x$**, is
$$h(y \mid x) = \frac{f(x, y)}{f_1(x)}, \quad \text{provided that } f_1(x) > 0.$$
The conditional p.m.f.s $g(x \mid y)$ and $h(y \mid x)$ behave like the p.m.f.s of one r.v.

Ex 1, p.197 (scanned file)

**Def.** The **conditional mean of $Y$, given that $X = x$**, is
$$\mu_{Y \mid x} = E(Y \mid x) = \sum_y y\, h(y \mid x).$$
The **conditional variance of $Y$, given that $X = x$**, is
$$\sigma_{Y \mid x}^2 = E\{[Y - E(Y \mid x)]^2 \mid x\} = \sum_y [y - E(Y \mid x)]^2\, h(y \mid x) = E(Y^2 \mid x) - [E(Y \mid x)]^2.$$

Ex 2-4, p. (scanned file)
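The conditional p.m.f. is just the joint row divided by the marginal, after which the conditional mean and variance follow as ordinary one-variable expectations. A sketch with an assumed joint p.m.f.:

```python
# Assumed joint p.m.f. of (X, Y) for illustration.
f = {
    (1, 1): 0.1, (1, 2): 0.3,
    (2, 1): 0.4, (2, 2): 0.2,
}

x = 1  # condition on the event {X = x}

# Marginal f1(x), then the conditional p.m.f. h(y | x) = f(x, y) / f1(x).
f1_x = sum(p for (xx, y), p in f.items() if xx == x)
h = {y: p / f1_x for (xx, y), p in f.items() if xx == x}

# Conditional mean and variance of Y given X = x, using
# Var(Y | x) = E(Y^2 | x) - [E(Y | x)]^2.
mean_Y_given_x = sum(y * p for y, p in h.items())
var_Y_given_x = sum(y**2 * p for y, p in h.items()) - mean_Y_given_x**2
print(h, mean_Y_given_x, var_Y_given_x)
```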

Conditional distributions are similarly defined for continuous r.v.s.

Ex 5, p.202 (scanned file)

Homework 4.3: 1, 7, 9, 11, 13, 19

Attachment: Scanned textbook pages of Section 4-3
