Joint Distributions. Tieming Ji. Fall 2012

Slide 1: Joint Distributions. Tieming Ji. Fall 2012.

Slide 2: X: a univariate random variable. (X, Y): a bivariate random variable. In this chapter we study the distributions of bivariate random variables, called joint distributions, in both the discrete and the continuous case. Motivation: many problems require studying two random variables at the same time. For example: the effect of pesticide application time on soybean yield; the relationship between population size and crime rate; phenotype and genotype in agronomy studies; the yield of a chemical reaction in conjunction with the temperature at which the reaction is run; etc.

Slide 3: Discrete Cases. Definition: Let X and Y be discrete random variables. The ordered pair (X, Y) is called a two-dimensional discrete random variable. The joint (probability) density function f for (X, Y) is defined by f(x, y) = P(X = x, Y = y). Definition: A function f is a valid discrete joint density function if and only if f(x, y) ≥ 0 for all x and y, and Σ_x Σ_y f(x, y) = 1.
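As an illustration of this definition, here is a minimal sketch that stores a joint pmf as a table and checks the two validity conditions. The probabilities are placeholders chosen for illustration, not the values from the slides.

```python
# Minimal sketch: a discrete joint pmf stored as a dictionary keyed by (x, y).
# The probabilities below are placeholders, not the table from the slides.
joint_pmf = {
    (0, 0): 0.5, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.2,
}

def is_valid_joint_pmf(pmf, tol=1e-12):
    """Check that f(x, y) >= 0 everywhere and that the probabilities sum to 1."""
    nonnegative = all(p >= 0 for p in pmf.values())
    sums_to_one = abs(sum(pmf.values()) - 1.0) < tol
    return nonnegative and sums_to_one

print(is_valid_joint_pmf(joint_pmf))   # True
print(joint_pmf[(1, 0)])               # P(X = 1, Y = 0) = 0.2
```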

Slide 4: Example (book page 157): In an automobile plant two tasks are performed by robots. The first entails welding two joints; the second, tightening three bolts. Let X denote the number of defective welds and Y the number of improperly tightened bolts produced per car. Past data indicate that the joint density for (X, Y) is as shown in the table on the slide, with x = 0, 1, 2 and y = 0, 1, 2, 3. 1. Is f(x, y) a proper joint density function? f(x, y) is a valid joint discrete density for (X, Y) because (1) f(x, y) ≥ 0 for every x and y, and (2) Σ_{x=0}^{2} Σ_{y=0}^{3} f(x, y) = 1.

Slide 5: 2. What is P(X = 1, Y = 0)? It is read directly from the table entry at x = 1, y = 0. 3. What is the probability that there are no improperly tightened bolts? Solution: When there are no improperly tightened bolts, Y = 0, so we want P(Y = 0). We have P(Y = 0) = P(X = 0, Y = 0) + P(X = 1, Y = 0) + P(X = 2, Y = 0).

Slide 6: Definition: The marginal probability density function f_X(x) for X, given the joint probability density function f_{X,Y}(x, y), is f_X(x) = Σ_y f_{X,Y}(x, y). Similarly, the marginal probability density function f_Y(y) for Y is f_Y(y) = Σ_x f_{X,Y}(x, y).
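Storing the joint pmf as a two-dimensional array makes the marginals literal row and column sums, as in this sketch (placeholder probabilities, not the table from the slides).

```python
import numpy as np

# Rows are x = 0, 1; columns are y = 0, 1, 2.  Placeholder probabilities.
joint = np.array([[0.30, 0.20, 0.10],
                  [0.15, 0.15, 0.10]])

f_X = joint.sum(axis=1)   # marginal of X: sum over y (across each row)
f_Y = joint.sum(axis=0)   # marginal of Y: sum over x (down each column)

print(f_X)   # [0.6 0.4]
print(f_Y)   # [0.45 0.35 0.2]
```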

Slide 7: Example: Compute the marginal distributions of X and Y from the table in the last example: f_X(x) is the sum across the row for each x, and f_Y(y) is the sum down the column for each y.

Slide 8: Continuous Cases. Definition: Let X and Y be continuous random variables. The ordered pair (X, Y) is called a two-dimensional continuous random variable. Given the joint continuous probability density f_{X,Y}(x, y), the probability that X ∈ [a, b] and Y ∈ [c, d] is P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f_{X,Y}(x, y) dy dx. Definition: A function f is a proper continuous joint density function if and only if f(x, y) ≥ 0 for all x and y, and ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dy dx = 1.
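A numerical double integral mirrors this definition directly. The sketch below uses scipy.integrate.dblquad with an assumed density f(x, y) = 4xy on the unit square (chosen only because it is easy to check by hand, not taken from the slides): it verifies that the density integrates to 1 and computes one rectangle probability.

```python
from scipy.integrate import dblquad

# Assumed illustrative density (not from the slides): f(x, y) = 4xy on [0, 1] x [0, 1].
def f(x, y):
    return 4.0 * x * y

# dblquad integrates the inner variable first, so its integrand takes (y, x).
total, _ = dblquad(lambda y, x: f(x, y), 0, 1, 0, 1)
print(total)    # ~1.0, so f is a proper joint density

# P(0 <= X <= 0.5, 0 <= Y <= 0.5): integrate f over that rectangle.
prob, _ = dblquad(lambda y, x: f(x, y), 0, 0.5, 0, 0.5)
print(prob)     # ~0.0625
```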

Slide 9: Example: In a healthy individual aged 20 to 29 years, the calcium level in the blood, X, is usually between 8.5 and 10.5 milligrams per deciliter (mg/dl), and the cholesterol level, Y, is usually between 120 and 240 mg/dl. Assume that for a healthy individual in this age group, X is uniformly distributed on (8.5, 10.5) and Y is uniformly distributed on (120, 240). Thus the joint density function is f_{X,Y}(x, y) = 1/240 for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240, and 0 otherwise. 1. Verify that f_{X,Y} is a proper probability density function. Solution: (1) For any x and y, f(x, y) is non-negative. (2) ∫_{8.5}^{10.5} ∫_{120}^{240} (1/240) dy dx = (10.5 − 8.5)(240 − 120)/240 = 1.

Slide 10: 2. Compute the probability that a healthy person in this age group will have a calcium level between 9 and 10 mg/dl and a cholesterol level between 125 and 140 mg/dl. Solution: This is P(9 ≤ X ≤ 10, 125 ≤ Y ≤ 140) = ∫_9^{10} ∫_{125}^{140} (1/240) dy dx = ∫_9^{10} (15/240) dx = 15/240 = 1/16 = 0.0625.
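This rectangle probability is easy to confirm numerically; integrating the uniform density 1/240 over 9 ≤ x ≤ 10 and 125 ≤ y ≤ 140 should return about 1/16 = 0.0625.

```python
from scipy.integrate import dblquad

# Joint density from the slides: 1/240 on 8.5 <= x <= 10.5, 120 <= y <= 240.
f = lambda y, x: 1.0 / 240.0

prob, _ = dblquad(f, 9, 10, 125, 140)   # x from 9 to 10, y from 125 to 140
print(prob)                             # ~0.0625 = 1/16
```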

Slide 11: Definition: Let (X, Y) be a two-dimensional continuous random variable with joint density f_{X,Y}. The marginal density for X, denoted by f_X, is given by f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy. The marginal density for Y, denoted by f_Y, is given by f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.

Slide 12: Example: Continuing the last example. 1. Compute the marginal distributions of X and Y. Solution: f_X(x) = ∫_{120}^{240} (1/240) dy = 1/2 for 8.5 ≤ x ≤ 10.5; when x < 8.5 or x > 10.5, f_X(x) = 0. f_Y(y) = ∫_{8.5}^{10.5} (1/240) dx = 1/120 for 120 ≤ y ≤ 240; when y < 120 or y > 240, f_Y(y) = 0.

Slide 13: 2. Compute the probability that a healthy individual has a cholesterol level between 150 and 200. Solution: This is P(150 ≤ Y ≤ 200) = ∫_{8.5}^{10.5} ∫_{150}^{200} (1/240) dy dx = (2)(50)/240 = 5/12 ≈ 0.417. We can also use the marginal density to compute this: P(150 ≤ Y ≤ 200) = ∫_{150}^{200} (1/120) dy = 50/120 = 5/12.

Slide 14: Example: Let f(x, y) = 1 for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 otherwise. Compute the probability that 1/2 ≤ X + Y.

Slide 15: Example: Let f(x, y) = e^{−(x+y)} for x, y ≥ 0, and 0 otherwise. Compute the probability that Y ≤ X.
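Reading this slide as asking for P(Y ≤ X), a symbolic check is short; by the symmetry of e^{−(x+y)} in x and y, the answer should be 1/2.

```python
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = sp.exp(-(x + y))          # joint density on x >= 0, y >= 0

# P(Y <= X): integrate y from 0 to x, then x from 0 to infinity.
p = sp.integrate(sp.integrate(f, (y, 0, x)), (x, 0, sp.oo))
print(p)                      # 1/2
```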

Slide 16: Example: In studying the behavior of air-supported roofs, the random variables X, the inside barometric pressure (in inches of mercury), and Y, the outside pressure, are considered. Assume that the joint density for (X, Y) is given by f_{X,Y}(x, y) = c/x when 27 ≤ y ≤ x ≤ 33, where the constant c satisfies 1/c = 6 − 27 ln(33/27), so that the density integrates to 1. 1. Compute the marginal density functions of X and Y.
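Under the reading of this slide used above (joint density c/x on the triangle 27 ≤ y ≤ x ≤ 33, which is an interpretation of the garbled source rather than a certainty), the normalizing constant and both marginals can be obtained symbolically, as in this sketch.

```python
import sympy as sp

x, y, c = sp.symbols("x y c", positive=True)
f = c / x                                          # assumed form of the joint density

# Normalizing constant: the integral over 27 <= y <= x <= 33 must equal 1.
total = sp.integrate(sp.integrate(f, (y, 27, x)), (x, 27, 33))
c_val = sp.solve(sp.Eq(total, 1), c)[0]
print(sp.simplify(1 / c_val))                      # equals 6 - 27*log(33/27), about 0.582

# Marginals on the triangular support:
f_X = sp.integrate(f.subs(c, c_val), (y, 27, x))   # c*(x - 27)/x for 27 <= x <= 33
f_Y = sp.integrate(f.subs(c, c_val), (x, y, 33))   # c*log(33/y)   for 27 <= y <= 33
print(sp.simplify(f_X), sp.simplify(f_Y))
```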

Slide 17: 2. Compute the probability that X ≤ 30 and Y ≤ …

Slide 18: Independence. Definition: Let X and Y be random variables with joint density f_XY and marginal densities f_X and f_Y, respectively. X and Y are independent if and only if f_XY(x, y) = f_X(x) f_Y(y) for all x and y.
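For a finite table, independence can be checked mechanically: compute both marginals and compare f_XY(x, y) with f_X(x) f_Y(y) in every cell. A small sketch with placeholder numbers:

```python
import numpy as np

# Placeholder joint pmf; rows index x, columns index y.
joint = np.array([[0.30, 0.20],
                  [0.30, 0.20]])

f_X = joint.sum(axis=1)          # marginal of X
f_Y = joint.sum(axis=0)          # marginal of Y
product = np.outer(f_X, f_Y)     # f_X(x) * f_Y(y) for every cell

print(np.allclose(joint, product))   # True: this particular table factorizes
```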

Slide 19: Example: Continuing the example on slide no. 4, using the joint table together with the marginals f_X(x) and f_Y(y). X and Y are not independent, because f_XY(x = 0, y = 0) = 0.840 while f_X(x = 0) f_Y(y = 0) ≠ f_XY(x = 0, y = 0).

Slide 20: Example: Continuing the example on slides no. 9 and no. 12. We have already derived the joint pdf and the marginal pdfs as follows: f_{X,Y}(x, y) = 1/240 for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240, 0 otherwise; f_X(x) = 1/2 for 8.5 ≤ x ≤ 10.5, 0 otherwise; f_Y(y) = 1/120 for 120 ≤ y ≤ 240, 0 otherwise. Thus, for every pair (x, y) ∈ R², f_{X,Y}(x, y) = f_X(x) f_Y(y), so X and Y are independent.

Slide 21: Expectation. For (X, Y) discrete, E(X) = Σ_x x f_X(x) = Σ_x Σ_y x f_XY(x, y), and E(Y) = Σ_y y f_Y(y) = Σ_x Σ_y y f_XY(x, y). For (X, Y) continuous, E(X) = ∫ x f_X(x) dx = ∫∫ x f_XY(x, y) dx dy, and E(Y) = ∫ y f_Y(y) dy = ∫∫ y f_XY(x, y) dx dy.

Slide 22: Extension: Let (X, Y) be a two-dimensional random variable with joint density f_XY, and let h(X, Y) be a random variable. The expected value of h(X, Y), denoted E(h(X, Y)), is given by E(h(X, Y)) = Σ_x Σ_y h(x, y) f_XY(x, y) when (X, Y) is discrete and E(h(X, Y)) exists, or E(h(X, Y)) = ∫∫ h(x, y) f_XY(x, y) dx dy when (X, Y) is continuous and E(h(X, Y)) exists.
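These formulas translate directly into sums over the table. The sketch below computes E(X), E(Y), and E(X + Y) for a placeholder joint pmf (not the table on the next slide) by summing h(x, y) f(x, y) over all cells; it also shows E(X + Y) = E(X) + E(Y).

```python
import numpy as np

x_vals = np.array([0, 1, 2])             # support of X (rows)
y_vals = np.array([0, 1])                # support of Y (columns)
joint = np.array([[0.20, 0.10],          # placeholder probabilities
                  [0.25, 0.15],
                  [0.10, 0.20]])

X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")   # X[i, j] = x_i, Y[i, j] = y_j

E_X = np.sum(X * joint)                  # sum over x and y of x * f(x, y)
E_Y = np.sum(Y * joint)
E_X_plus_Y = np.sum((X + Y) * joint)     # E(h(X, Y)) with h(x, y) = x + y

print(E_X, E_Y, E_X_plus_Y)              # 1.0 0.45 1.45, and 1.45 = 1.0 + 0.45
```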

Slide 23: Example: The joint pdf of X and Y is given in the following table, together with the marginals f_X(x) and f_Y(y). Compute E(X) and E(X + Y).

Slide 24: Example: The joint density for the random variable (X, Y), where X denotes the calcium level and Y denotes the cholesterol level in the blood of a healthy individual, is f_{X,Y}(x, y) = 1/240 for 8.5 ≤ x ≤ 10.5 and 120 ≤ y ≤ 240, and 0 otherwise. Compute E(X) and E(XY).
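For this uniform density the expectations can be computed symbolically; because the density factors over the rectangle, E(XY) should equal E(X)E(Y) = 9.5 × 180 = 1710.

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.Rational(1, 240)                  # joint density on 8.5 <= x <= 10.5, 120 <= y <= 240

lims_x = (x, sp.Rational(17, 2), sp.Rational(21, 2))   # 8.5 to 10.5
lims_y = (y, 120, 240)

E_X = sp.integrate(x * f, lims_x, lims_y)
E_XY = sp.integrate(x * y * f, lims_x, lims_y)
print(E_X, E_XY)                         # 19/2 and 1710
```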

Slide 25: Covariance. Definition: Let X and Y be random variables with means μ_X and μ_Y, respectively. The covariance between X and Y, denoted Cov(X, Y) or σ_XY, is given by Cov(X, Y) = E((X − μ_X)(Y − μ_Y)). Using the definition to compute covariance is often inconvenient; instead we use the formula Cov(X, Y) = E((X − μ_X)(Y − μ_Y)) = E(XY) − E(X)E(Y).
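A sketch of the shortcut formula on a finite table (placeholder probabilities): compute E(X), E(Y), and E(XY), then take Cov(X, Y) = E(XY) − E(X)E(Y).

```python
import numpy as np

x_vals = np.array([0, 1])
y_vals = np.array([0, 1])
joint = np.array([[0.40, 0.10],          # placeholder joint pmf
                  [0.10, 0.40]])

X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
E_X, E_Y = np.sum(X * joint), np.sum(Y * joint)
E_XY = np.sum(X * Y * joint)

cov = E_XY - E_X * E_Y                   # shortcut formula
print(cov)                               # 0.4 - 0.5 * 0.5 = 0.15
```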

Slide 26: Theorem: Let (X, Y) be a two-dimensional random variable with joint density f_XY. If X and Y are independent, then E(XY) = E(X)E(Y). Thus, if X and Y are independent, then Cov(X, Y) = E(XY) − E(X)E(Y) = 0. That is, independence implies that the covariance is 0. However, when the covariance of two random variables is 0, it is not always true that they are independent.

Slide 27: Example (book page 169): The joint density for X and Y is given in a table in which X takes the values 1 and 4 and Y takes four values, each with marginal probability f_Y(y) = 1/4. Across the four y values, the joint probabilities are 0, 1/4, 1/4, 0 in the row x = 1 and 1/4, 0, 0, 1/4 in the row x = 4, so f_X(1) = f_X(4) = 1/2. From the table, we have E(X) = 5/2, E(Y) = 0, and E(XY) = 0, yielding a covariance of 0. However, X and Y are not independent.
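To make this concrete, the sketch below uses the joint probabilities as read from this slide together with assumed y values −2, −1, 1, 2 (the y labels are an assumption, chosen symmetric about 0 so that E(Y) = 0 as stated); it confirms a covariance of 0 even though the joint pmf does not factor into the marginals.

```python
import numpy as np

x_vals = np.array([1, 4])
y_vals = np.array([-2, -1, 1, 2])             # assumed labels, symmetric about 0
joint = np.array([[0.00, 0.25, 0.25, 0.00],   # joint probabilities as read from the slide
                  [0.25, 0.00, 0.00, 0.25]])

X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
E_X, E_Y = np.sum(X * joint), np.sum(Y * joint)
cov = np.sum(X * Y * joint) - E_X * E_Y
print(E_X, E_Y, cov)                          # 2.5 0.0 0.0

f_X, f_Y = joint.sum(axis=1), joint.sum(axis=0)
print(np.allclose(joint, np.outer(f_X, f_Y)))  # False: dependent despite zero covariance
```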

Slide 28: Correlation. Definition: Let X and Y be random variables with means μ_X and μ_Y and variances σ²_X and σ²_Y, respectively. The correlation ρ_XY between X and Y is given by ρ_XY = Cov(X, Y)/√(σ²_X σ²_Y) = Cov(X, Y)/(σ_X σ_Y). This is also called the Pearson coefficient of correlation.

Slide 29: Property: The Pearson correlation coefficient ρ_XY ranges from −1 to 1, and it measures the linearity of the relationship between X and Y. Specifically: when ρ_XY = 1, Y can be written as Y = β_0 + β_1 X with β_1 positive; when ρ_XY = −1, Y can be written as Y = β_0 + β_1 X with β_1 negative; when ρ_XY = 0, there is no linear relationship between X and Y, but they could have other relationships.

Slide 30: Example: The joint pdf of X and Y is given in the following table, together with the marginals f_X(x) and f_Y(y). Compute ρ_XY.
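A sketch of the computation with a placeholder table (not the one on this slide): form Cov(X, Y), σ_X, and σ_Y from the joint pmf and take their ratio.

```python
import numpy as np

x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
joint = np.array([[0.25, 0.05],          # placeholder joint pmf (rows: x, columns: y)
                  [0.15, 0.15],
                  [0.05, 0.35]])

X, Y = np.meshgrid(x_vals, y_vals, indexing="ij")
E_X, E_Y = np.sum(X * joint), np.sum(Y * joint)
var_X = np.sum(X**2 * joint) - E_X**2
var_Y = np.sum(Y**2 * joint) - E_Y**2
cov = np.sum(X * Y * joint) - E_X * E_Y

rho = cov / np.sqrt(var_X * var_Y)       # Pearson correlation coefficient
print(rho)
```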

Slide 31: Conditional Density. Definition: Let (X, Y) be a two-dimensional random variable with joint density f_XY and marginal densities f_X and f_Y. Then the conditional density for X given Y = y is f_{X|Y=y}(x) = f_XY(x, y)/f_Y(y), when f_Y(y) > 0; and the conditional density for Y given X = x is f_{Y|X=x}(y) = f_XY(x, y)/f_X(x), when f_X(x) > 0. Note: When the two random variables are independent, the conditional density is the same as the marginal density, i.e., f_{X|Y}(x) = f_X(x), and similarly for Y.
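For a discrete table, conditioning on Y = y just rescales the corresponding column of the joint pmf by f_Y(y). A sketch with placeholder numbers:

```python
import numpy as np

x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
joint = np.array([[0.30, 0.10],          # placeholder joint pmf (rows: x, columns: y)
                  [0.20, 0.10],
                  [0.10, 0.20]])

f_Y = joint.sum(axis=0)                       # marginal of Y
j = int(np.where(y_vals == 0)[0][0])          # column index for Y = 0
cond_X_given_Y0 = joint[:, j] / f_Y[j]        # f_{X|Y=0}(x) = f_XY(x, 0) / f_Y(0)

print(cond_X_given_Y0)                        # [0.5  0.333... 0.166...]
print(cond_X_given_Y0[x_vals <= 1].sum())     # P(X <= 1 | Y = 0) = 0.833...
```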

Slide 32: Example: The joint pdf of X and Y is given in the following table. Compute P(X ≤ 1 | Y = 0).

Slide 33: Chapter Summary: For both the discrete and continuous cases, know the definition of the joint pdf, and use the joint pdf to compute the marginal pdfs. Given the joint pdf, be able to compute the probability that X and Y satisfy specific requirements, such as P(X + Y ≤ 1), P(X < Y < 0.5), etc. Know how expectations are computed from the marginal density or the joint density; covariance and independence; correlation; and the relationship among the marginal, conditional, and joint densities.
