Properties of Expected Values and Variance

Properties of Expected Values and Variance
Christopher Croke, University of Pennsylvania. Math 115, UPenn, Fall 2011.

Expected value

Consider a random variable $Y = r(X)$ for some function $r$, e.g. $Y = X^2$, so in this case $r(x) = x^2$. It turns out (and we have already used) that
$$E(r(X)) = \int r(x) f(x)\,dx.$$
This is not obvious, since by definition
$$E(r(X)) = \int x f_Y(x)\,dx,$$
where $f_Y(x)$ is the probability density function of $Y = r(X)$. You get from one integral to the other by careful use of $u$-substitution.

One consequence is
$$E(aX + b) = \int (ax + b) f(x)\,dx = aE(X) + b.$$
(It is not usually the case that $E(r(X)) = r(E(X))$.) Similar facts hold for discrete random variables.
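
As a quick numerical sanity check (my own addition, not part of the original slides), the sketch below compares the two sides of $E(r(X)) = \int r(x) f(x)\,dx$ for $X$ uniform on $[0, 2]$ and $r(x) = x^2$, using Monte Carlo sampling and a midpoint Riemann sum; the sample size and seed are arbitrary choices. It also shows that $E(X^2) \neq E(X)^2$ here.

```python
# Hedged sketch: check E(r(X)) = integral of r(x) f(x) dx for X ~ Uniform[0, 2], r(x) = x^2.
import numpy as np

rng = np.random.default_rng(0)           # arbitrary seed
x = rng.uniform(0.0, 2.0, size=1_000_000)

mc_estimate = np.mean(x**2)              # Monte Carlo estimate of E(X^2)

# Midpoint Riemann sum for the integral of r(x) f(x) with f(x) = 1/2 on [0, 2]
dx = 2.0 / 10_000
grid = np.arange(0.0, 2.0, dx) + dx / 2
integral = np.sum(grid**2 * 0.5) * dx

print(mc_estimate, integral)             # both close to 4/3
print(np.mean(x)**2)                     # E(X)^2 is about 1, so E(X^2) != E(X)^2
```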

Problem

For $X$ the uniform distribution on $[0, 2]$, what is $E(X^2)$?

If $X_1, X_2, X_3, \ldots, X_n$ are random variables and $Y = r(X_1, X_2, X_3, \ldots, X_n)$, then
$$E(Y) = \int \cdots \int r(x_1, x_2, \ldots, x_n) f(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n,$$
where $f(x_1, x_2, \ldots, x_n)$ is the joint probability density function.

Problem: Consider again our example of randomly choosing a point in $[0, 1] \times [0, 1]$. We could let $X$ be the random variable of choosing the first coordinate and $Y$ the second. What is $E(X + Y)$? (Note that $f(x, y) = 1$.)

Easy properties of expected values: If $\Pr(X \ge a) = 1$ then $E(X) \ge a$. If $\Pr(X \le b) = 1$ then $E(X) \le b$.
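
For a worked check of both problems (my own sketch, not in the slides), a short symbolic computation gives $E(X^2) = 4/3$ for $X$ uniform on $[0, 2]$ and $E(X + Y) = 1$ for the unit-square example:

```python
# Hedged sketch: symbolic evaluation of the two expectation problems above.
import sympy as sp

x, y = sp.symbols('x y')

# X ~ Uniform[0, 2] has density f(x) = 1/2, so E(X^2) = integral of x^2 * 1/2 over [0, 2]
E_X2 = sp.integrate(x**2 * sp.Rational(1, 2), (x, 0, 2))
print(E_X2)            # 4/3

# (X, Y) uniform on the unit square has joint density f(x, y) = 1
E_X_plus_Y = sp.integrate(x + y, (x, 0, 1), (y, 0, 1))
print(E_X_plus_Y)      # 1
```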

Properties of E(X)

A little more surprising (but not hard, and we have already used it):
$$E(X_1 + X_2 + X_3 + \cdots + X_n) = E(X_1) + E(X_2) + E(X_3) + \cdots + E(X_n).$$

Another way to look at binomial random variables: let $X_i$ be $1$ if the $i$th trial is a success and $0$ if a failure. Note that $E(X_i) = 0 \cdot q + 1 \cdot p = p$. Our binomial variable (the number of successes) is $X = X_1 + X_2 + X_3 + \cdots + X_n$, so
$$E(X) = E(X_1) + E(X_2) + E(X_3) + \cdots + E(X_n) = np.$$

What about products? This only works out well if the random variables are independent. If $X_1, X_2, X_3, \ldots, X_n$ are independent random variables, then
$$E\left(\prod_{i=1}^n X_i\right) = \prod_{i=1}^n E(X_i).$$
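
A small simulation (again my own illustration, with arbitrary parameters) checks both facts: the mean of a Binomial$(n, p)$ count built as a sum of Bernoulli indicators is close to $np$, and for independent uniforms the mean of the product is close to the product of the means.

```python
# Hedged sketch: linearity gives E(X) = np for a binomial; independence gives E(UV) = E(U)E(V).
import numpy as np

rng = np.random.default_rng(1)                # arbitrary seed
n, p, trials = 20, 0.3, 200_000

bernoullis = rng.random((trials, n)) < p      # each row: n independent success indicators
X = bernoullis.sum(axis=1)                    # binomial counts
print(X.mean(), n * p)                        # both near 6.0

U = rng.random(trials)
V = rng.random(trials)                        # independent of U
print(np.mean(U * V), U.mean() * V.mean())    # both near 1/4
```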

Problem: Consider independent random variables $X_1$, $X_2$, and $X_3$, where $E(X_1) = 2$, $E(X_2) = 1$, and $E(X_3) = 0$. Compute $E\big(X_1^2 (X_2 + 3X_3)^2\big)$.

There is not enough information! Also assume $E(X_1^2) = 3$, $E(X_2^2) = 1$, and $E(X_3^2) = 2$.

Facts about Var(X):

Var(X) = 0 means the same as: there is a $c$ such that $\Pr(X = c) = 1$.

$\sigma^2(X) = E(X^2) - E(X)^2$ (alternative definition).

$\sigma^2(aX + b) = a^2 \sigma^2(X)$. Proof:
$$\sigma^2(aX + b) = E[(aX + b - (a\mu + b))^2] = E[(aX - a\mu)^2] = a^2 E[(X - \mu)^2] = a^2 \sigma^2(X).$$

For independent $X_1, X_2, X_3, \ldots, X_n$,
$$\sigma^2(X_1 + X_2 + X_3 + \cdots + X_n) = \sigma^2(X_1) + \sigma^2(X_2) + \sigma^2(X_3) + \cdots + \sigma^2(X_n).$$
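
Expanding the square and using independence (so that cross moments factor), the problem above works out to 57; the little computation below is my own worked check, not part of the slides.

```python
# Hedged sketch: E[X1^2 (X2 + 3 X3)^2]
#   = E[X1^2] * E[(X2 + 3 X3)^2]                      (X1 independent of X2, X3)
#   = E[X1^2] * (E[X2^2] + 6 E[X2] E[X3] + 9 E[X3^2]) (expand; E[X2 X3] = E[X2] E[X3])
E_X1, E_X2, E_X3 = 2, 1, 0          # given first moments
E_X1sq, E_X2sq, E_X3sq = 3, 1, 2    # given second moments

inner = E_X2sq + 6 * E_X2 * E_X3 + 9 * E_X3sq   # E[(X2 + 3 X3)^2] = 1 + 0 + 18 = 19
answer = E_X1sq * inner
print(answer)                                   # 3 * 19 = 57
```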

Note that the last statement tells us that
$$\sigma^2(a_1 X_1 + a_2 X_2 + a_3 X_3 + \cdots + a_n X_n) = a_1^2 \sigma^2(X_1) + a_2^2 \sigma^2(X_2) + a_3^2 \sigma^2(X_3) + \cdots + a_n^2 \sigma^2(X_n).$$

Now we can compute the variance of the binomial distribution with parameters $n$ and $p$. As before, $X = X_1 + X_2 + \cdots + X_n$ where the $X_i$ are independent with $\Pr(X_i = 0) = q$ and $\Pr(X_i = 1) = p$. So $\mu(X_i) = p$ and
$$\sigma^2(X_i) = E(X_i^2) - \mu(X_i)^2 = p - p^2 = p(1 - p) = pq.$$
Thus $\sigma^2(X) = \sum \sigma^2(X_i) = npq$.
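
As a quick check of $\sigma^2(X) = npq$ (my own addition, with arbitrary parameters), simulate binomial counts and compare the sample variance with $npq$:

```python
# Hedged sketch: sample variance of Binomial(n, p) counts should be close to n*p*q.
import numpy as np

rng = np.random.default_rng(2)   # arbitrary seed
n, p = 20, 0.3
q = 1 - p

X = rng.binomial(n, p, size=200_000)
print(X.var(), n * p * q)        # both near 4.2
```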

Consider our random variable $Z$ which is the sum of the coordinates of a point randomly chosen from $[0, 1] \times [0, 1]$. Then $Z = X + Y$, where $X$ and $Y$ both represent choosing a point randomly from $[0, 1]$. They are independent, and last time we showed $\sigma^2(X) = \frac{1}{12}$. So $\sigma^2(Z) = \frac{1}{6}$.

What is $E(XY)$? They are independent, so $E(XY) = E(X)E(Y) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$. ($\sigma^2(XY)$ is more complicated.)
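
The same quantities can be checked exactly with a short symbolic computation (my own sketch, assuming the uniform densities above):

```python
# Hedged sketch: exact moments for X, Y independent Uniform[0, 1] and Z = X + Y.
import sympy as sp

x, y = sp.symbols('x y')

var_X = sp.integrate(x**2, (x, 0, 1)) - sp.integrate(x, (x, 0, 1))**2
print(var_X)                        # 1/12

# The joint density is 1 on the unit square, so integrate directly.
E_Z = sp.integrate(x + y, (x, 0, 1), (y, 0, 1))
E_Z2 = sp.integrate((x + y)**2, (x, 0, 1), (y, 0, 1))
print(E_Z2 - E_Z**2)                # 1/6

E_XY = sp.integrate(x * y, (x, 0, 1), (y, 0, 1))
print(E_XY)                         # 1/4
```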

Why is $E(\sum_{i=1}^n X_i) = \sum_{i=1}^n E(X_i)$ even when the $X_i$ are not independent?
$$E\left(\sum_{i=1}^n X_i\right) = \int \cdots \int (x_1 + x_2 + \cdots + x_n) f(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n$$
$$= \sum_{i=1}^n \int \cdots \int x_i f(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n$$
$$= \sum_{i=1}^n \int x_i f_i(x_i)\,dx_i = \sum_{i=1}^n E(X_i),$$
where $f_i$ is the marginal density of $X_i$.
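
To illustrate that independence really is not needed here (an illustration of my own), take two strongly dependent variables, say $Y = X^2$ for a uniform $X$, and check that the means still add:

```python
# Hedged sketch: linearity of expectation for dependent variables, X ~ Uniform[0, 1], Y = X^2.
import numpy as np

rng = np.random.default_rng(3)   # arbitrary seed
X = rng.random(500_000)
Y = X**2                         # completely determined by X, so certainly not independent

print(np.mean(X + Y))            # near 1/2 + 1/3 = 5/6
print(X.mean() + Y.mean())       # essentially the same value, by linearity
```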
