Statistics 100A Homework 7 Solutions


Ryan Rosario

Chapter 6

A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 15 percent will purchase a color television set, and 40 percent will just be browsing. If 5 customers enter his store on a given day, what is the probability that he will sell exactly 2 ordinary sets and 1 color set on that day?

In this problem we take a look at a distribution that we have not studied in this course, the multinomial distribution. The PMF of the multinomial distribution is

P(X_1 = n_1, ..., X_r = n_r) = n!/(n_1! n_2! ... n_r!) p_1^{n_1} p_2^{n_2} ... p_r^{n_r},  where n_1 + n_2 + ... + n_r = n.

The probability that exactly 2 ordinary sets (X_1 = 2) and 1 color set (X_2 = 1) are purchased, with the remaining X_3 = 2 customers just browsing, is

P(X_1 = 2, X_2 = 1, X_3 = 2) = 5!/(2! 1! 2!) (0.45)^2 (0.15) (0.40)^2 ≈ 0.1458.

A man and a woman agree to meet at a certain location about 12:30 p.m. The man arrives at a time uniformly distributed between 12:15 and 12:45, and the woman independently arrives at a time uniformly distributed between 12:00 and 1:00 p.m. Find the probability that the first to arrive waits no longer than 5 minutes for the other. What is the probability that the man arrives first?

The easiest way to approach this problem is with a geometric representation. Let X be the number of minutes past noon that the man arrives, and let Y be the number of minutes past noon that the woman arrives. From the problem, 15 ≤ X ≤ 45 and 0 ≤ Y ≤ 60, independently and uniformly.

First we find the probability that the first to arrive waits no longer than 5 minutes for the other person. There are two ways this can happen: the man arrives first and waits for the woman, or vice versa.

Consider first the case where the man arrives first. The man arrives some time between 15 and 45 minutes past noon, and the woman independently arrives some time within the hour, so the sample region is a rectangle of area 30 × 60 = 1800. For the man not to get impatient, the woman must arrive no later than 5 minutes after him; the arrival times for which this happens form the parallelogram X < Y < X + 5, which has area 5 × 30 = 150. The probability that the woman arrives no more than 5 minutes after the man is therefore 150/1800 = 1/12.

[Figure: the rectangle 15 ≤ x ≤ 45, 0 ≤ y ≤ 60, with the strip between the lines y = x and y = x + 5 shaded.]

Next, consider the case where the woman arrives first. The woman arrives some time between 0 and 60 minutes past noon, and the acceptable time for the man to arrive is no later than 5 minutes after the woman. Given that the woman arrives some time during the hour and the man independently arrives between 15 and 45 minutes past the hour, the sample region again has area 1800. For the woman not to get impatient, the man must arrive no later than 5 minutes after her; the arrival times for which this happens form the parallelogram Y < X < Y + 5, which also has area 150. The probability that the man arrives no more than 5 minutes after the woman is 150/1800 = 1/12.

[Figure: the same rectangle, with the strip between the lines y = x - 5 and y = x shaded.]

Putting the two cases together,

P(|X - Y| ≤ 5) = P(X < Y < X + 5) + P(Y < X < Y + 5) = 1/12 + 1/12 = 1/6.

Notice that the probability that the man waits no more than 5 minutes for the woman equals the probability that the woman waits no more than 5 minutes for the man; the two cases are equally likely. Finally, since X and Y are independent and both arrival-time distributions are symmetric about 30 minutes past noon, the probability that the man arrives first is P(X < Y) = 1/2.
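As a quick numerical sanity check of the two results above, here is a short Python sketch (the seed, sample size, and direct arithmetic are illustrative only): it evaluates the multinomial probability and estimates the meeting-time probabilities by Monte Carlo.

    import math, random

    # Multinomial probability: 2 ordinary sets, 1 color set, 2 browsers among 5 customers.
    p = (math.factorial(5) / (math.factorial(2) * math.factorial(1) * math.factorial(2))
         * 0.45**2 * 0.15 * 0.40**2)
    print(p)  # about 0.1458

    # Monte Carlo for the meeting problem: X ~ Uniform(15, 45), Y ~ Uniform(0, 60), independent.
    random.seed(1)
    n = 200_000
    within5 = first = 0
    for _ in range(n):
        x = random.uniform(15, 45)   # man's arrival time, minutes past noon
        y = random.uniform(0, 60)    # woman's arrival time, minutes past noon
        within5 += abs(x - y) <= 5
        first += x < y
    print(within5 / n)  # about 1/6 = 0.167
    print(first / n)    # about 1/2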

5. The random vector (X, Y) is said to be uniformly distributed over a region R in the plane if, for some constant c, its joint density is

f(x, y) = c if (x, y) is in R, and 0 otherwise.

(a) Show that 1/c = area of region R.

Proof. Since f(x, y) is a joint density function, it must integrate to 1 over R. But recall from Math 32B that the double integral of 1 over a region R is the area of R, call it A(R). Then

∫∫_R c dy dx = c A(R) = 1,

so A(R) = 1/c.

(b) Suppose that (X, Y) is uniformly distributed over the square centered at (0, 0) whose sides are of length 2. Show that X and Y are independent, with each being distributed uniformly over (-1, 1).

Proof. From part (a), A(R) = 1/c. The square centered at (0, 0) with sides of length 2 has area 4, so 1/c = 4 and c = 1/4; that is, f(x, y) = 1/4 for (x, y) in R. The marginal density of X is

f_X(x) = ∫_{-1}^{1} (1/4) dy = 1/2,  -1 < x < 1,

and likewise f_Y(y) = 1/2 for -1 < y < 1, so each of X and Y is uniformly distributed over (-1, 1). Since

f(x, y) = 1/4 = f_X(x) f_Y(y),

X and Y are independent.

(c) What is the probability that (X, Y) lies in the circle of radius 1 centered at the origin? That is, find P(X^2 + Y^2 ≤ 1).

P(X^2 + Y^2 ≤ 1) = ∫∫_{x^2 + y^2 ≤ 1} c dx dy = c (area of circle) = (area of circle)/(area of square),

since c = 1/(area of square). The area of the square is 4 and the area of the unit circle is π(1)^2 = π. So

P(X^2 + Y^2 ≤ 1) = π/4.
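A short Monte Carlo sketch of part (c), checking that a point uniform on the square lands inside the unit circle with probability π/4 (seed and sample size are illustrative only):

    import math, random

    random.seed(2)
    n = 500_000
    hits = sum(1 for _ in range(n)
               if random.uniform(-1, 1)**2 + random.uniform(-1, 1)**2 <= 1)
    print(hits / n)     # about 0.785
    print(math.pi / 4)  # 0.7853...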

8. Two points are selected randomly on a line of length L so as to be on opposite sides of the midpoint of the line. Find the probability that the distance between the two points is greater than L/3.

Let X be the leftmost point; its position is uniformly distributed between 0 and L/2. Let Y be the rightmost point; its position is uniformly distributed between L/2 and L. The two positions are independent, so the joint density is (2/L)(2/L) = 4/L^2 on the rectangle 0 < x < L/2, L/2 < y < L. We want P(Y - X > L/3). Note that P(|X - Y| > L/3) is the same event, but since we assigned Y to be the rightmost point, the representation P(Y - X > L/3) is the more natural one:

P(Y - X > L/3) = ∫∫_{y - x > L/3} (4/L^2) dy dx.

We have to be careful here; there are two cases. If 0 < x < L/6, then every y in (L/2, L) satisfies y - x > L/3, so y ranges over the full interval of length L/2. If L/6 < x < L/2, then y must exceed x + L/3. Therefore

P(Y - X > L/3) = (4/L^2) [ ∫_0^{L/6} ∫_{L/2}^{L} dy dx + ∫_{L/6}^{L/2} ∫_{x + L/3}^{L} dy dx ]
= (4/L^2) [ (L/6)(L/2) + ∫_{L/6}^{L/2} (2L/3 - x) dx ]
= (4/L^2) [ L^2/12 + L^2/9 ]
= 7/9.

Show that f(x, y) = 1/x, 0 < y < x < 1, is a joint density function. Assuming that f is the joint density function of X and Y, find the following.

First we show that f(x, y) is a joint density function.

Proof. f is nonnegative on its support, and

∫_0^1 ∫_0^x (1/x) dy dx = ∫_0^1 [y/x]_{y=0}^{x} dx = ∫_0^1 1 dx = 1.

(a) The marginal density of Y.

f_Y(y) = ∫_y^1 (1/x) dx = [log x]_y^1 = -log y,  0 < y < 1.

(b) The marginal density of X.

f_X(x) = ∫_0^x (1/x) dy = [y/x]_{y=0}^{x} = 1,  0 < x < 1,

so X is uniformly distributed over (0, 1).

(c) E(X).

E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 x dx = 1/2.

(d) E(Y).

E(Y) = ∫_0^1 y (-log y) dy = -∫_0^1 y log y dy.

Integrate by parts: let u = log y and dv = y dy, so du = (1/y) dy and v = y^2/2. Then

-∫_0^1 y log y dy = -[ (y^2/2) log y - y^2/4 ]_0^1 = 1/4,

so E(Y) = 1/4.

4. The joint density of X and Y is given by

f(x, y) = x e^{-(x+y)},  x > 0, y > 0;  0 otherwise.

Are X and Y independent? What if f(x, y) were instead given by

f(x, y) = 2,  0 < x < y, 0 < y < 1;  0 otherwise?

X and Y are independent if and only if f(x, y) = f_X(x) f_Y(y). We already have f(x, y), so we find the marginals:

f_X(x) = ∫_0^∞ x e^{-(x+y)} dy = [-x e^{-(x+y)}]_{y=0}^{∞} = x e^{-x},  x > 0,

and

f_Y(y) = ∫_0^∞ x e^{-(x+y)} dx = [-x e^{-(x+y)} - e^{-(x+y)}]_{x=0}^{∞} = [-e^{-(x+y)}(x + 1)]_{x=0}^{∞} = e^{-y},  y > 0.

Yes, X and Y are independent, since

f_X(x) f_Y(y) = x e^{-x} e^{-y} = x e^{-(x+y)} = f(x, y).

For the other PDF given, we do the same thing:

f_X(x) = ∫_x^1 2 dy = 2(1 - x),  0 < x < 1,

and

f_Y(y) = ∫_0^y 2 dx = 2y,  0 < y < 1.

No, X and Y are not independent in this case, since

f_X(x) f_Y(y) = 2(1 - x) · 2y ≠ f(x, y).
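The independence conclusions above can be checked numerically; the sketch below uses scipy's quad for the improper integrals, with arbitrary test points of my choosing, and is illustrative rather than part of the derivation.

    import math
    from scipy.integrate import quad

    # First density: f(x, y) = x * exp(-(x + y)) on x > 0, y > 0.
    f = lambda x, y: x * math.exp(-(x + y))
    x0, y0 = 1.3, 0.7                                # arbitrary test point
    fx, _ = quad(lambda y: f(x0, y), 0, math.inf)    # marginal of X at x0
    fy, _ = quad(lambda x: f(x, y0), 0, math.inf)    # marginal of Y at y0
    print(fx, x0 * math.exp(-x0))                    # both equal x0 * e^{-x0}
    print(fy, math.exp(-y0))                         # both equal e^{-y0}
    print(abs(fx * fy - f(x0, y0)) < 1e-8)           # True: the joint density factors

    # Second density: f(x, y) = 2 on 0 < x < y < 1.
    x0, y0 = 0.3, 0.8                                # a point inside the region
    print(2 * (1 - x0) * 2 * y0, 2)                  # 2.24 vs 2: marginals do not factor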

3. The random variables X and Y have joint density function

f(x, y) = 12 x y (1 - x),  0 < x < 1, 0 < y < 1;  0 otherwise.

Before considering the parts that follow, let's find the marginal densities of X and Y.

We want f_X(x) to be in terms of x, so we integrate out y over its full range 0 < y < 1:

f_X(x) = ∫_0^1 12 x y (1 - x) dy = 12 x (1 - x) [y^2/2]_0^1 = 6 x (1 - x),  0 < x < 1.

We want f_Y(y) to be in terms of y, so we integrate out x over its full range 0 < x < 1:

f_Y(y) = ∫_0^1 12 x y (1 - x) dx = 12 y ∫_0^1 (x - x^2) dx = 12 y [x^2/2 - x^3/3]_0^1 = 12 y (1/6) = 2y,  0 < y < 1.

(a) Are X and Y independent?

Again, we check whether or not f(x, y) = f_X(x) f_Y(y):

f_X(x) f_Y(y) = 6 x (1 - x) · 2y = 12 x y (1 - x) = f(x, y).

Yes, X and Y are independent.

(b) Find E(X).

E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 6 x^2 (1 - x) dx = 6 ∫_0^1 (x^2 - x^3) dx = 6 [x^3/3 - x^4/4]_0^1 = 6 (1/3 - 1/4) = 1/2.

(c) Find E(Y).

E(Y) = ∫_0^1 y f_Y(y) dy = ∫_0^1 2 y^2 dy = [2 y^3/3]_0^1 = 2/3.

(d) Find Var(X). Recall that Var(X) = E(X^2) - E(X)^2. We already have E(X). To find E(X^2) (the second moment), we compute

E(X^2) = ∫_0^1 x^2 f_X(x) dx = ∫_0^1 6 x^3 (1 - x) dx = 6 ∫_0^1 (x^3 - x^4) dx = 6 [x^4/4 - x^5/5]_0^1 = 6 (1/4 - 1/5) = 3/10,

and we already know that E(X) = 1/2, so

Var(X) = E(X^2) - E(X)^2 = 3/10 - (1/2)^2 = 3/10 - 1/4 = 1/20.

(e) Find Var(Y). Again, Var(Y) = E(Y^2) - E(Y)^2:

Var(Y) = ∫_0^1 2 y^3 dy - (2/3)^2 = 1/2 - 4/9 = 1/18.
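The moments just computed can be checked by simulation. The sketch below uses the fact that the marginal densities 6x(1 - x) and 2y are the Beta(2, 2) and Beta(2, 1) densities; the seed and sample size are illustrative.

    import random, statistics

    random.seed(3)
    n = 200_000
    xs = [random.betavariate(2, 2) for _ in range(n)]   # density 6x(1 - x) on (0, 1)
    ys = [random.betavariate(2, 1) for _ in range(n)]   # density 2y on (0, 1)
    print(statistics.mean(xs), statistics.pvariance(xs))  # about 0.5 and 0.05  (= 1/2, 1/20)
    print(statistics.mean(ys), statistics.pvariance(ys))  # about 0.667 and 0.056 (= 2/3, 1/18)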

8. If X_1 and X_2 are independent exponential random variables with respective parameters λ_1 and λ_2, find the distribution of Z = X_1/X_2. Also compute P(X_1 < X_2).

First, note that the joint density is

f(x_1, x_2) = λ_1 λ_2 e^{-λ_1 x_1} e^{-λ_2 x_2},  x_1 > 0, x_2 > 0.

We start with the CDF of Z = X_1/X_2 and take the derivative to get f_Z. For a > 0,

F_Z(a) = P(X_1/X_2 ≤ a) = P(X_1 ≤ a X_2)
= ∫_0^∞ ∫_0^{a x_2} λ_1 λ_2 e^{-λ_1 x_1} e^{-λ_2 x_2} dx_1 dx_2
= ∫_0^∞ λ_2 e^{-λ_2 x_2} [-e^{-λ_1 x_1}]_{x_1 = 0}^{a x_2} dx_2
= ∫_0^∞ λ_2 e^{-λ_2 x_2} (1 - e^{-λ_1 a x_2}) dx_2
= ∫_0^∞ λ_2 e^{-λ_2 x_2} dx_2 - ∫_0^∞ λ_2 e^{-x_2 (λ_2 + λ_1 a)} dx_2
= 1 - λ_2/(λ_2 + λ_1 a)
= λ_1 a/(λ_2 + λ_1 a).

Then,

f_Z(a) = F_Z'(a) = λ_1 λ_2/(λ_1 a + λ_2)^2,  a > 0.

Next,

P(X_1 < X_2) = P(X_1/X_2 < 1) = F_Z(1) = λ_1/(λ_1 + λ_2).
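A simulation sketch of this result, with arbitrary rates λ_1 = 1.5 and λ_2 = 0.8 chosen for the check (illustrative only):

    import random

    random.seed(4)
    lam1, lam2 = 1.5, 0.8
    n = 200_000
    z = [random.expovariate(lam1) / random.expovariate(lam2) for _ in range(n)]
    a = 2.0
    print(sum(zi <= a for zi in z) / n)   # empirical F_Z(a)
    print(lam1 * a / (lam1 * a + lam2))   # derived CDF  lambda1*a / (lambda1*a + lambda2)
    print(sum(zi < 1 for zi in z) / n)    # empirical P(X1 < X2)
    print(lam1 / (lam1 + lam2))           # lambda1 / (lambda1 + lambda2)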

3. The expected number of typographical errors on a page of a certain magazine is 0.2. What is the probability that an article of 10 pages contains (a) 0 and (b) 2 or more typographical errors?

We know that the error rate is 0.2 per page. Let X_i be the number of errors on page i; then X_i is Poisson with rate λ_i = 0.2. We are working with a sum of independent Poisson random variables: the sum X = Σ X_i is Poisson with rate λ = Σ λ_i = 10(0.2) = 2. Then

P(X = x) = 2^x e^{-2}/x!.

(a) 0 typographical errors.

From the above, P(X = 0) = 2^0 e^{-2}/0! = e^{-2} ≈ 0.135.

(b) 2 or more typographical errors.

P(X ≥ 2) = 1 - P(X = 0) - P(X = 1) = 1 - e^{-2} - 2 e^{-2}/1! = 1 - 3 e^{-2} ≈ 0.594.

The monthly worldwide average number of airplane crashes of commercial airlines is 2.2. What is the probability that there will be (a) more than 2 such accidents in the next month; (b) more than 4 such accidents in the next 2 months; (c) more than 5 such accidents in the next 3 months?

Again, we have a monthly average rate of plane crashes. Assuming the probability of a crash on any particular flight is very small, we use the Poisson distribution. For one month, λ = 2.2. Let X be the number of airplane crashes in one month, so

P(X = x) = (2.2)^x e^{-2.2}/x!.

(a) more than 2 such accidents in the next month?

We are still considering only one month, so

P(X > 2) = 1 - P(X = 0) - P(X = 1) - P(X = 2) = 1 - e^{-2.2}[1 + 2.2 + (2.2)^2/2!] = 1 - 5.62 e^{-2.2} ≈ 0.377.
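A quick numerical check of these Poisson probabilities (an illustrative sketch):

    import math

    def pois_cdf(k, lam):
        # P(X <= k) for X ~ Poisson(lam)
        return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

    print(math.exp(-2))          # P(X = 0) with lambda = 2: about 0.135
    print(1 - pois_cdf(1, 2))    # P(X >= 2) = 1 - 3e^{-2}: about 0.594
    print(1 - pois_cdf(2, 2.2))  # P(X > 2) with lambda = 2.2: about 0.377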

(b) more than 4 such accidents in the next 2 months?

Now we are considering the next two months, so we are working with the sum of two independent Poisson random variables. Let X_2 be the number of airplane crashes in the second month and let X = X_1 + X_2 be the number of airplane crashes in the two months; X_2 follows the same distribution, with the same mean, as X_1. Then X is a Poisson random variable with λ = λ_1 + λ_2 = 4.4. Why? Because a sum of independent Poisson random variables is again Poisson, and

E(X) = E(X_1 + X_2) = E(X_1) + E(X_2) = λ_1 + λ_2.

Then

P(X > 4) = 1 - P(X = 0) - P(X = 1) - P(X = 2) - P(X = 3) - P(X = 4)
= 1 - e^{-4.4}[1 + 4.4 + (4.4)^2/2! + (4.4)^3/3! + (4.4)^4/4!]
= 1 - 44.89 e^{-4.4}
≈ 0.449.

(c) more than 5 such accidents in the next 3 months?

Now we are considering the next three months, so we are working with the sum of three independent Poisson random variables. Let X_3 be the number of airplane crashes in the third month and let X = X_1 + X_2 + X_3 be the number of airplane crashes in the three months; X_3 follows the same distribution, with the same mean, as X_1 and X_2. Then X is a Poisson random variable with λ = λ_1 + λ_2 + λ_3 = 6.6, and

P(X > 5) = 1 - P(X = 0) - P(X = 1) - P(X = 2) - P(X = 3) - P(X = 4) - P(X = 5)
= 1 - e^{-6.6}[1 + 6.6 + (6.6)^2/2! + (6.6)^3/3! + (6.6)^4/4! + (6.6)^5/5!]
= 1 - 260.72 e^{-6.6}
≈ 0.645.

The gross weekly sales at a certain restaurant is a normal random variable with mean $2200 and standard deviation $230. What is the probability that

(a) the total gross sales over the next 2 weeks exceeds $5000?

Let X_1 be the restaurant's gross sales for week 1 and let X_2 be the gross sales for week 2. We want the probability that the total gross sales over the next 2 weeks exceeds $5000, so we need a new random variable to represent the total gross sales over the two weeks. We assume that X_1 and X_2 follow the same distribution, with the same mean µ and standard deviation σ, and note that they are independent. Let X = X_1 + X_2 be the sum of the two normal random variables. Then we need the mean and SD of this new random variable:

E(X) = E(X_1 + X_2) = E(X_1) + E(X_2) = µ + µ = 2µ = 2(2200) = 4400,
Var(X) = Var(X_1 + X_2) = Var(X_1) + Var(X_2) = σ^2 + σ^2 = 2(230)^2 = 105,800.

Thus X is normal with mean 4400 and variance 105,800 (standard deviation √105,800 ≈ 325.27).

P(X > 5000) = 1 - P(X ≤ 5000) = 1 - P(Z ≤ (5000 - 4400)/325.27) = 1 - P(Z ≤ 1.845) ≈ 0.033.

(b) weekly sales exceed $2000 in at least 2 of the next 3 weeks?

This is a two-step problem. We want the probability that sales exceed the given amount in at least 2 of the next 3 weeks, so we will use the binomial distribution with n = 3. The first step is finding p.

Find p. Consider the event that weekly sales exceed $2000 in an arbitrary week. Let X be the total gross sales in that week; then X is normal with mean 2200 and standard deviation 230, and

p = P(X > 2000) = 1 - P(X ≤ 2000) = 1 - P(Z ≤ (2000 - 2200)/230) = 1 - P(Z < -0.87) = P(Z < 0.87) ≈ 0.808.

Find P(Y ≥ 2). Let Y be the number of the 3 weeks in which total gross sales exceed $2000. Then

P(Y ≥ 2) = P(Y = 2) + P(Y = 3) = (3 choose 2)(0.808)^2(0.192) + (3 choose 3)(0.808)^3 ≈ 0.376 + 0.528 = 0.904.

According to the U.S. National Center for Health Statistics, 25.2 percent of males and 23.6 percent of females never eat breakfast. Suppose that random samples of 200 men and 200 women are chosen. Approximate the probability that (a) at least 110 of those 400 people never eat breakfast, and (b) the number of the women who never eat breakfast is at least as large as the number of the men who never eat breakfast.

This is a binomial problem, but since n is large and we are asked for an approximation, we use the normal approximation to the binomial. Let X be the number of men who never eat breakfast and let Y be the number of women who never eat breakfast. Note that n = 200 for both the men and the women. Then

E(X) = n p_X = 200(0.252) = 50.4,  E(Y) = n p_Y = 200(0.236) = 47.2,

Var(X) = n p_X (1 - p_X) = 200(0.252)(0.748) ≈ 37.70, so σ_X ≈ 6.14,
Var(Y) = n p_Y (1 - p_Y) = 200(0.236)(0.764) ≈ 36.06, so σ_Y ≈ 6.01.

(a) at least 110 of those 400 people never eat breakfast.

In this part we collapse the men and women into one group; that is, we create a new variable W = X + Y. We need the distribution of W:

E(W) = E(X + Y) = E(X) + E(Y) = 50.4 + 47.2 = 97.6,
Var(W) = Var(X + Y) = Var(X) + Var(Y) ≈ 73.76,  so σ_W ≈ 8.59.

So W is approximately normal with mean 97.6 and standard deviation 8.59. Then, using the continuity correction,

P(W ≥ 110) ≈ 1 - P(Z < (109.5 - 97.6)/8.59) = 1 - P(Z < 1.385) ≈ 0.083.

(b) the number of the women who never eat breakfast is at least as large as the number of the men who never eat breakfast.

We want P(Y ≥ X) = P(Y - X ≥ 0), so we need the distribution of Y - X. Note that the standard deviation of Y - X is the same as that of X + Y, since Var(Y - X) = Var(Y) + Var(X). Then

E(Y - X) = E(Y) - E(X) = 47.2 - 50.4 = -3.2,

and, again using the continuity correction,

P(Y - X ≥ 0) ≈ P(Z ≥ (-0.5 - (-3.2))/8.59) = P(Z ≥ 0.3143) = 1 - P(Z < 0.3143) ≈ 0.377.
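The normal-distribution answers above can be reproduced with the standard normal CDF written in terms of the error function; the sketch below repeats the calculations for the restaurant-sales and breakfast problems with the same parameters used in the solutions (illustrative only).

    import math

    def Phi(z):
        # standard normal CDF
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    # Restaurant sales (a): two-week total is normal with mean 4400, variance 2 * 230**2.
    print(1 - Phi((5000 - 4400) / (230 * math.sqrt(2))))   # about 0.033

    # Restaurant sales (b): p = P(one week > 2000), then at least 2 of 3 weeks.
    p = 1 - Phi((2000 - 2200) / 230)
    print(p)                              # about 0.808
    print(3 * p**2 * (1 - p) + p**3)      # about 0.90

    # Breakfast problem: normal approximation with continuity correction.
    mx, vx = 200 * 0.252, 200 * 0.252 * 0.748   # men
    my, vy = 200 * 0.236, 200 * 0.236 * 0.764   # women
    sd = math.sqrt(vx + vy)
    print(1 - Phi((109.5 - (mx + my)) / sd))    # (a) P(X + Y >= 110): about 0.083
    print(1 - Phi((-0.5 - (my - mx)) / sd))     # (b) P(Y >= X): about 0.377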

35. In Problem 2, calculate the conditional probability mass function of X_1 given that (a) X_2 = 1; (b) X_2 = 0.

We want the conditional probability mass function of X_1 given whether or not the second ball chosen is white (the value of X_2). We must calculate this for both parts of the original problem: the joint mass function of (X_1, X_2) and the joint mass function of (X_1, X_2, X_3). We really only need the probability that X_1 = 1, since the probability that X_1 = 0 is just the complement.

(a) X_2 = 1.

i. Using the joint mass function of (X_1, X_2):

P(X_1 = 1 | X_2 = 1) = P(X_1 = 1, X_2 = 1)/P(X_2 = 1) = p(1, 1)/(p(1, 1) + p(0, 1)) = 1/3.

ii. Using the joint mass function of (X_1, X_2, X_3):

P(X_1 = 1 | X_2 = 1) = (p(1, 1, 1) + p(1, 1, 0))/(p(1, 1, 1) + p(1, 1, 0) + p(0, 1, 1) + p(0, 1, 0)) = 1/3.

(b) X_2 = 0.

i. P(X_1 = 1 | X_2 = 0) = P(X_1 = 1, X_2 = 0)/P(X_2 = 0) = p(1, 0)/(p(1, 0) + p(0, 0)) ≈ 0.417.

ii. P(X_1 = 1 | X_2 = 0) = (p(1, 0, 1) + p(1, 0, 0))/(p(1, 0, 1) + p(1, 0, 0) + p(0, 0, 1) + p(0, 0, 0)) ≈ 0.417.

Choose a number X at random from the set of numbers {1, 2, 3, 4, 5}. Now choose a number at random from the subset of numbers no larger than X, that is, from {1, ..., X}. Call this second number Y.

(a) Find the joint mass function of X and Y.

Note that this sampling method has two steps: once we draw a value for X, we then draw a value for Y, so Y is conditional on X. By the multiplication rule,

P_{Y|X}(y | x) = P(x, y)/P_X(x),   so   P(x, y) = P_{Y|X}(y | x) P_X(x).

Start with P_X(x). In the set {1, 2, 3, 4, 5} each number is equally likely to be chosen for the value of X, so P_X(x) = 1/5 for x = 1, ..., 5. Once we know which value x is chosen for X, we choose a value for Y that is no greater than x, and each number in the subset {1, ..., x} is equally likely, so

P_{Y|X}(y | x) = 1/x,  1 ≤ y ≤ x ≤ 5.

Then

P(x, y) = P(X = x, Y = y) = 1/(5x),  1 ≤ y ≤ x ≤ 5.

(b) Find the conditional mass function of X given that Y = i. Do it for i = 1, 2, 3, 4, 5.

P_{X|Y}(x | i) = P(X = x, Y = i)/P_Y(Y = i).

To find the marginal of Y, think about the original sampling scheme: Y = i is possible only when X ≥ i, so only the values X = i, i + 1, ..., 5 contribute to P_Y(Y = i).

Therefore

P_Y(Y = i) = Σ_{k=i}^{5} P(X = k, Y = i) = Σ_{k=i}^{5} 1/(5k),

and so

P_{X|Y}(X = j | Y = i) = (1/(5j)) / (Σ_{k=i}^{5} 1/(5k)),  j = i, i + 1, ..., 5.

(c) Are X and Y independent? Why?

No; if they were independent, this would not be much of an exercise. X and Y are not independent because the choice of Y is restricted by the choice of X: P(X = j | Y = i) ≠ P(X = j). If they were equal, that would mean that the probability of X taking some value does not depend at all on the known value of Y, which is not the case here.

4. The joint probability mass function of X and Y is given by

p(1, 1) = 1/8,  p(1, 2) = 1/4,  p(2, 1) = 1/8,  p(2, 2) = 1/2.

(a) Compute the conditional mass function of X given Y = i, for i = 1, 2.

P(X = 1 | Y = 1) = P(X = 1, Y = 1)/P(Y = 1) = p(1, 1)/(p(1, 1) + p(2, 1)) = (1/8)/(1/4) = 1/2,
P(X = 2 | Y = 1) = 1 - P(X = 1 | Y = 1) = 1/2.

P(X = 1 | Y = 2) = P(X = 1, Y = 2)/P(Y = 2) = p(1, 2)/(p(1, 2) + p(2, 2)) = (1/4)/(3/4) = 1/3,
P(X = 2 | Y = 2) = 1 - P(X = 1 | Y = 2) = 2/3.

(b) Are X and Y independent?

Check whether P(X = j, Y = i) = P(X = j) P(Y = i). Take i = j = 1:

P(X = 1, Y = 1) = p(1, 1) = 1/8,
P(X = 1) = p(1, 1) + p(1, 2) = 3/8,
P(Y = 1) = p(1, 1) + p(2, 1) = 1/4.

Note that P(X = 1) P(Y = 1) = 3/32 ≠ 1/8 = P(X = 1, Y = 1). No, X and Y are not independent.

(c) Compute the following: P(X + Y > 2), P(XY ≤ 3), and P(X/Y > 1).

P(X + Y > 2) = p(1, 2) + p(2, 1) + p(2, 2) = 1/4 + 1/8 + 1/2 = 7/8.

P(XY ≤ 3) = p(1, 1) + p(1, 2) + p(2, 1) = 1/8 + 1/4 + 1/8 = 1/2.

P(X/Y > 1) = P(X > Y) = p(2, 1) = 1/8.
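A short sketch that recomputes parts (a) through (c) of this last problem directly from the joint mass function (illustrative only):

    p = {(1, 1): 1/8, (1, 2): 1/4, (2, 1): 1/8, (2, 2): 1/2}

    pY = {i: sum(v for (x, y), v in p.items() if y == i) for i in (1, 2)}
    for i in (1, 2):
        print({x: p[(x, i)] / pY[i] for x in (1, 2)})   # P(X = x | Y = i)

    pX1 = p[(1, 1)] + p[(1, 2)]
    print(pX1 * pY[1], p[(1, 1)])                       # 3/32 vs 1/8: not independent

    print(sum(v for (x, y), v in p.items() if x + y > 2))   # 7/8
    print(sum(v for (x, y), v in p.items() if x * y <= 3))  # 1/2
    print(sum(v for (x, y), v in p.items() if x > y))       # 1/8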
