EE 322: Probabilistic Methods for Electrical Engineers. Zhengdao Wang, Department of Electrical and Computer Engineering, Iowa State University

1 EE 322: Probabilistic Methods for Electrical Engineers. Zhengdao Wang, Department of Electrical and Computer Engineering, Iowa State University. Discrete Random Variables

2 Introduction to Random Variables Intuitively: a random variable takes a value every time an experiment is performed. Examples: the number of cars that pass by within a minute; the height of a randomly picked student; the voltage of a signal received from an antenna at a certain time. Two kinds of random variables: discrete and continuous.

3 PMF: Probability Mass Function Example: coin flipping, getting Head and Tail with 1/2 chance each. Map H to 0 and T to 1, and denote the result X. P(X = 0) = P(getting a Head) = 1/2. P(X = 1) = P(getting a Tail) = 1/2. (Figure: PMF of X.)

4 PMF: 6-sided die A 6-sided die gives 1 to 6 with equal probability. Let X denote the result. Then P(X = i) = 1/6 for i = 1, ..., 6. (Figure: PMF of X, each bar at height 1/6.)

5 PMF: number of heads in two flips HH, HT, TH, TT: each with 1/4 probability. Let N_h be the number of heads. P(N_h = 0) = P({TT}) = 1/4. P(N_h = 1) = P({HT, TH}) = 1/2. P(N_h = 2) = P({HH}) = 1/4. So, the PMF is...

6 Histogram A graphical display of frequencies. Example: 10 numbers, 6 zeros and 4 ones. The frequency of zero is 6/10, and the frequency of one is 4/10. The histogram has a bar of height 6/10 at zero and a bar of height 4/10 at one.

7 In Matlab Use hist(). Example: x = randint(1, 1e6, 100); hist(x, 100); (randint, from the Communications Toolbox, returns uniform integers in {0, ..., 99} here; in recent Matlab versions, randi is the replacement.) Caution: Matlab does not divide the bar heights by the total number of samples.
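A minimal sketch of a normalized histogram, i.e. an empirical PMF, using only base Matlab functions (the variable names are mine):

    x = randi([0 99], 1, 1e6);       % 1e6 uniform integers in {0, ..., 99}
    counts = histc(x, 0:99);         % occurrences of each value
    bar(0:99, counts / numel(x));    % divide by the sample size so heights sum to 1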

8 PDF: probability density function The PDF tells us how the total probability of 100% is distributed across the values. Waiting time T at a bus station: uniformly distributed between 0 and 10, so f_T(t) = 1/10 for t ∈ [0, 10]. Voltage X across a resistor: thermal noise, Gaussian distributed, f_X(x) = (1/√(2πσ²)) exp(−x²/(2σ²)), where σ² depends on the bandwidth of the voltage meter, the resistance, and the temperature.

9 Properties of PDF P(a < X < b) = ∫_a^b f_X(x) dx; ∫_{−∞}^{∞} f_X(x) dx = 1; f_X(x) ≥ 0 for all x.

10 Random Variables: Definition An assignment of a real number to each point in the sample space; a function mapping the sample space into the real line. (Figure: the random variable X maps the sample space Ω to a real number x on the number line.) Idea: randomness is in the experiment. A random variable just assigns a number to each sample point. Each run of the experiment yields a specific sample point ω, which produces a sample value, say x = X(ω), of the random variable.

11 Random variable vs. probability law (Figure: a probability law maps subsets of the sample space Ω to [0, 1].) A Probability Law maps an event (a subset of Ω) to [0, 1].

12 Random variable vs. probability law A Probability Law maps an event (a subset of Ω) to [0, 1]. A Random Variable maps an outcome (an element of Ω) to (−∞, ∞).

13 Example of Random variable Throw two 4-sided dice. Random variable X: the maximum of the two rolls. (Figure: the 16 points (1st roll, 2nd roll) mapped to the real number line by the random variable X = max. roll.)
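A small sketch (the variable names are mine) that computes the PMF of this X by brute-force enumeration of the 16 equally likely outcomes:

    pX = zeros(1, 4);
    for r1 = 1:4
        for r2 = 1:4
            x = max(r1, r2);         % the random variable X = max. roll
            pX(x) = pX(x) + 1/16;    % each outcome (r1, r2) has probability 1/16
        end
    end
    disp(pX)                         % 1/16, 3/16, 5/16, 7/16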

14 Example of Random variable More examples. Let ω = (ω_1, ω_2), where ω_i is the value of the ith roll: X = product of the two faces, X(ω) = ω_1 ω_2; Y = (difference)², Y(ω) = (ω_1 − ω_2)²; R = outcome of the first roll, R(ω) = ω_1; S = outcome of the second roll, S(ω) = ω_2.

15 Probability Mass Functions Abbreviation: PMF; suitable for discrete RVs. For continuous RVs, we will use the Probability Density Function (PDF) instead. Definition and notation: p_X(x) = P({X = x}). (Figure: the probability law maps subsets of Ω to [0, 1], while the random variable X maps elements of Ω to the real line.) Notation: upper case letters for RVs and lower case letters for their values.

16 Calculation of PMF of a Random Variable For each value x the RV X can take, do the following: collect all the possible outcomes that give rise to the event {X = x}, then obtain p_X(x) = P({X = x}). Example: let X be the number of heads obtained from two independent tosses of a fair coin. Then p_X(x) = 1/4 if x = 0 or x = 2; 1/2 if x = 1; 0 otherwise.

17 Remarks about PMF PMF properties: 0 ≤ p_X(x) ≤ 1 for all x, and Σ_x p_X(x) = 1. Random variables are often referred to according to their PMFs. E.g., we say that a random variable is a Bernoulli RV if its PMF is p_X(x) = 1 − p if x = 0, and p if x = 1.

18 Binomial PMF Binomial: toss a coin n times, and let X denote the number of heads. p_X(k) = P(X = k) = (n choose k) p^k (1 − p)^(n−k), k = 0, 1, ..., n. (Figures: Binomial PMF for n = 9, p = 0.5 and for n = 90, p = 0.1.)
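As a sketch, the binomial PMF can be evaluated directly from this formula (nchoosek is a base Matlab function; if the Statistics Toolbox is available, binopdf(k, n, p) gives the same numbers):

    n = 9;  p = 0.5;
    k = 0:n;
    pmf = arrayfun(@(kk) nchoosek(n, kk), k) .* p.^k .* (1-p).^(n-k);
    stem(k, pmf)                     % Binomial PMF, n = 9, p = 0.5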

19 Geometric and Poisson Geometric PMF: the number of tosses for a head to show up, p_X(k) = (1 − p)^(k−1) p, k = 1, 2, ... (Figure: Geometric PMF, p = 0.2.) Poisson PMF: p_X(k) = e^(−λ) λ^k / k!, k = 0, 1, 2, ... Good for modeling the total effect of many independent small-probability events. Used extensively in queuing theory.

20 Poisson PMF (cont'd) A good approximation of the Binomial PMF for large n and small p: e^(−λ) λ^k / k! ≈ n!/(k!(n − k)!) p^k (1 − p)^(n−k), k = 0, 1, ..., n, where λ = np, n is large, and p is small. (Figures: Poisson PMF for λ = 0.5 and λ = 3.)

21 Functions of Random Variables Suppose we have a RV X defined on an experiment. We can define functions of the random variable, Y = g(X), such as X², |X|, cos(X), e^(3X). For each such function, Y takes a value every time we run the experiment. These are functions of a random variable, and are random variables themselves. The PMF of Y, p_Y(y), can be found from the original probability model or from the PMF of X.

22 FUNCTIONS OF RANDOM VARIABLES (Figure: the sample space maps to the set of possible values of X, which maps to the set of possible values of Y.) Let B(y) be the set of outcomes corresponding to y, and A(y) = {x : g(x) = y}, so that p_Y(y) = P(B(y)) = P(X ∈ A(y)). In detail: p_Y(y) = P({ω : Y(ω) = y}) = Σ_{ω : g(X(ω)) = y} P(ω) = Σ_{x : g(x) = y} Σ_{ω : X(ω) = x} P(ω) = Σ_{x : g(x) = y} p_X(x).
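The last line of this derivation turns directly into code: sum p_X(x) over all x with g(x) = y. A sketch using the example on the next slide (names are mine):

    xvals = 0:3;  pX = [1/8 3/8 3/8 1/8];   % X = number of heads in 3 fair tosses
    g = @(x) 3 - x;                         % Y = g(X) = number of tails
    yvals = unique(g(xvals));
    pY = arrayfun(@(y) sum(pX(g(xvals) == y)), yvals)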

23 EXAMPLE X = number of heads in three tosses of a fair coin. The PMF of X is p_X(x) = 1/8 for x = 0, 3; 3/8 for x = 1, 2; 0 otherwise. Y = g(X) = 3 − X (the number of tails). Find the PMF of Y. p_Y(y) = Pr(Y = y) = Pr(3 − X = y) = Pr(X = 3 − y) = p_X(3 − y). So p_Y(0) = p_X(3) = 1/8, p_Y(1) = p_X(2) = 3/8, etc.

24 Exercise X = number of heads in three tosses of a fair coin, and Y = X². Find the PMF of Y: p_Y(y) = Pr(Y = y) = Pr(X² = y).

25 Expectation, Mean, Variance OUTLINE Review of the concepts of random variables and the PMF; functions of random variables; expectation, mean, variance. Reading: Bertsekas & Tsitsiklis, 2.3.

26 Expectation The expectation or mean of a random variable X, with PMF p_X(x), is defined as E[X] = Σ_x x p_X(x). (Figure: the PMF balances at its center of gravity, c = mean = E[X].) E[X] is the sum of the values weighted by their probabilities; it is the way you compute your GPA.
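As a one-line computational check of the definition (a sketch; the fair-die numbers anticipate a later slide):

    x = 1:6;  pX = ones(1, 6) / 6;
    EX = sum(x .* pX)                % probability-weighted sum of values: 3.5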

27 Variance, Moments, Standard deviation 2nd moment: E[X²]; nth moment: E[X^n]. Variance: var(X) = σ_X² = E[(X − E[X])²], which is always non-negative. Standard deviation: σ_X = √var(X).

28 Expected Value Properties Linearity of the expectation: E[g_1(X) + g_2(X)] = E[g_1(X)] + E[g_2(X)]. Variance in terms of moments: var(X) = E[X²] − (E[X])². Mean and variance of a linear function of a RV: let Y = aX + b; then E[Y] = aE[X] + b and var(Y) = a² var(X).

29 Mean and Variance of Bernoulli RV Bernoulli PMF: p_X(x) = 1 − p if x = 0, and p if x = 1. Mean, second moment, and variance: E[X] = 1·p + 0·(1 − p) = p; E[X²] = 1²·p + 0²·(1 − p) = p; var[X] = E[X²] − (E[X])² = p − p² = p(1 − p). Note: the mean may not be a possible outcome.

30 Six-Sided Die Let X denote the number we get from a single roll of a 6-sided die: p_X(x) = 1/6 if x = 1, 2, 3, 4, 5, 6, and 0 otherwise. E[X] = 3.5. E[X²] = (1/6)(1² + 2² + ... + 6²) = 91/6. var[X] = 91/6 − (3.5)² = 35/12. σ_X = √var[X] = √(35/12).

31 A generalization: n-sided die Let X denote the RV that takes values between 1 and n, inclusive, equally likely: p_X(x) = 1/n if x = 1, 2, 3, ..., n, and 0 otherwise. E[X] = (1 + n)/2. E[X²] = (1/n) Σ_{k=1}^n k² = (n + 1)(2n + 1)/6. var[X] = (n + 1)(2n + 1)/6 − ((1 + n)/2)² = (n² − 1)/12.

32 Geometric RV Geometric PMF (the number of tosses needed for a head): p_X(k) = (1 − p)^(k−1) p, k = 1, 2, ... In order to compute the mean, we will need the fact Σ_{k=1}^∞ k a^(k−1) = 1/(1 − a)², which can be obtained by differentiating w.r.t. a the identity Σ_{k=1}^∞ a^k = a/(1 − a). (Figure: Geometric PMF, p = 0.2.) E[X] = 1/p, var[X] = (1 − p)/p².
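A quick simulation sketch of E[X] = 1/p; the inverse-CDF sampling formula X = ⌈ln U / ln(1 − p)⌉ for U uniform on (0, 1) is a standard trick, not from the slides:

    p = 0.2;  trials = 1e5;
    X = ceil(log(rand(1, trials)) / log(1 - p));   % Geometric(p) samples
    mean(X)                                        % close to 1/p = 5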

33 Expected Value Rule Let Y = g(X) be a function of the RV X. We can verify that E[Y] = Σ_y y p_Y(y) = Σ_x g(x) p_X(x). Discrete case: E[g(X)] = Σ_x g(x) p_X(x). Continuous case: E[g(X)] = ∫ g(x) f_X(x) dx.

34 Example of ER To understand the ER, we may use a small example. Let X be a RV that is equal to 1, 2, 3, and 4 with equal probability 1/4 (think of a 4-faced die). Let Y = (X − 2)², which is a function of the RV X.
x    p_X(x)    (x − 2)²
1    1/4       1
2    1/4       0
3    1/4       1
4    1/4       4
We want to find E[Y]. By the definition of expectation, we know E[Y] = Σ_y y p_Y(y), where p_Y(y) = 1/4 for y = 0; 1/4 for y = 4; 1/4 + 1/4 for y = 1.

35 Example of ER Therefore, E[Y] = 0 · (1/4) + 4 · (1/4) + 1 · (1/2) = 3/2. But tracing back to where the values 0, 4, 1 and the probabilities 1/4 and 1/2 come from, we know that E[Y] can also be written as E[Y] = (2 − 2)² · (1/4) + (4 − 2)² · (1/4) + (1 − 2)² · (1/4) + (3 − 2)² · (1/4), and this is the ER. The ER is actually very intuitive from the table: there is 1/4 probability of X = 2, and then Y = (2 − 2)²; 1/4 probability of X = 4, and then Y = (4 − 2)²; 1/4 probability of X = 1, and then Y = (1 − 2)²; 1/4 probability of X = 3, and then Y = (3 − 2)². Summing up all four possibilities of Y (two of them have the same Y value), weighted by the probability 1/4, we obtain the expectation (average) of Y.
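Both sides of the ER can be checked numerically; a sketch for this example:

    xvals = 1:4;  pX = ones(1, 4) / 4;
    lhs = sum([0 1 4] .* [1/4 1/2 1/4]);   % sum over y of y * pY(y)
    rhs = sum((xvals - 2).^2 .* pX);       % sum over x of g(x) * pX(x)
    % lhs and rhs both equal 3/2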

36 ER for Conditional Expectation In addition to the ER for unconditional expectation, we also have one for conditional expectation, where one replaces the unconditional PMF or PDF with the conditional one. The ER for conditional expectation is as follows: E[g(X) | Y = y] = Σ_x g(x) p_{X|Y}(x|y) in the discrete case (1), and E[g(X) | Y = y] = ∫ g(x) f_{X|Y}(x|y) dx in the continuous case (2).

37 Multiple Random Variables Reading: Bertsekas & Tsitsiklis. We have seen that we can define many RVs on a given experiment. So far, we considered the PMF of only one RV at a time; the ideas extend to two (or more) RVs. Given an experiment with two RVs X and Y defined on it, the joint PMF is defined by p_{X,Y}(x, y) = P(X = x and Y = y) = P({ω : X(ω) = x and Y(ω) = y}). Each performance of the experiment produces a sample (x, y), a random pair of numbers. From the additivity and normalization axioms of probability, Σ_{x,y} p_{X,Y}(x, y) = 1.

38 Intuitive Example (from C. Bishop) A fiendish murder has been committed. Two suspects: the Butler or the Cook. There are three possible murder weapons: a Knife, a Pistol, a fireplace Poker. Source: slides.pdf

39 Prior Distribution The Butler has served the family well for many years. The Cook was hired recently, with rumours of a dodgy history.

40 Conditional Distribution The Butler is ex-army and keeps a gun in a locked drawer. The Cook has access to lots of knives. The Butler is older and getting frail.

41 Joint Distribution What is the probability that the Cook committed the murder using the Pistol? Likewise for the other five combinations of Culprit and Weapon.

42 Joint Distribution (Figure: joint table over culprit x and weapon y; the six entries sum to 100%.) p(x, y) = p(x) p(y|x) (Product Rule).

43 Marginal Distribution of Culprit P(x) = Σ_y P(x, y) (Sum Rule).

44 Marginal Distribution of Weapon P(y) = Σ_x P(x, y) (Sum Rule).

45 Posterior Distribution We discover a Pistol at the scene of the crime. (Figure: posterior probabilities of the two culprits, 20% and 80%.)

46 Bayes Theorem P(x, y) = P(x)P(y|x) = P(y)P(x|y), so P(x|y) = P(y|x)P(x) / P(y), where P(x|y) is the posterior, P(y|x) the likelihood, and P(x) the prior. Prior: belief before making a particular observation. Posterior: belief after making the observation. The posterior is the prior for the next observation: intrinsically incremental.
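A sketch of one Bayes update for the murder example; the prior and likelihood numbers below are made up for illustration (the actual numbers were in the lost slide figures):

    prior = [0.3 0.7];                 % assumed P(x), x = Butler, Cook
    likPistol = [0.7 0.1];             % assumed P(Pistol | x); the Butler keeps a gun
    joint = prior .* likPistol;        % P(x) P(y|x)
    post = joint / sum(joint)          % P(x | Pistol) = [0.75 0.25] with these numbers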

47 The Rules of Probability Sum Rule: P(x) = Σ_y P(x, y). Product Rule: P(x, y) = P(x)P(y|x). Bayes Rule: P(x|y) = P(y|x)P(x) / P(y), with denominator P(y) = Σ_x P(x)P(y|x).

48 Multiple Random Variables - Example Our usual 2-dice problem: let X be the outcome of the first roll, Y of the second; then p_{X,Y}(x, y) = 1/16 for x = 1, 2, 3, 4 and y = 1, 2, 3, 4. Same experiment: let S = sum of the two faces, T = product. Here a table might help: the rows correspond to toss 1, the columns to toss 2, and each entry shows the sample values of (S, T) resulting from the underlying sample point:
1: (2,1)  (3,2)  (4,3)   (5,4)
2: (3,2)  (4,4)  (5,6)   (6,8)
3: (4,3)  (5,6)  (6,9)   (7,12)
4: (5,4)  (6,8)  (7,12)  (8,16)

49 Multiple Random Variables - Example Since each of the original sample points has probability 1/16, the new joint PMF can be evaluated by suitable summing: p_{S,T}(s, t) = 1/16 for (s, t) = (2, 1), (4, 4), (6, 9), (8, 16); 1/8 for (s, t) = (3, 2), (4, 3), (5, 4), (5, 6), (6, 8), (7, 12). It can be checked that this PMF sums to 1.
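The "suitable summing" is mechanical; a sketch that tabulates p_{S,T} by enumeration (containers.Map is a base Matlab class):

    pST = containers.Map('KeyType', 'char', 'ValueType', 'double');
    for r1 = 1:4
        for r2 = 1:4
            key = sprintf('(%d,%d)', r1 + r2, r1 * r2);   % (S, T) for this outcome
            if isKey(pST, key), pST(key) = pST(key) + 1/16;
            else, pST(key) = 1/16; end
        end
    end
    % pST('(2,1)') is 1/16, pST('(3,2)') is 1/8, and the values sum to 1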

50 Marginal PMFs Suppose X and Y are two random variables defined on a common experiment and that they have a joint PMF p_{X,Y}(x, y). As before, we can also determine their individual or marginal PMFs p_X(x) and p_Y(y). How are these related? Using total probability: p_X(x) = P(X = x) = Σ_y P(X = x and Y = y) = Σ_y p_{X,Y}(x, y). The Sum Rule: p_X(x) = Σ_y p_{X,Y}(x, y), and p_Y(y) = Σ_x p_{X,Y}(x, y).

51 Joint and Marginal PMFs Joint PMF p_{X,Y}(x, y) in tabular form; row sums give the marginal PMF p_Y(y), and column sums give the marginal PMF p_X(x):
        x_1    x_2    x_3    x_4   | p_Y(y)
y_1:    0      1/20   1/20   1/20  | 3/20
y_2:    1/20   2/20   3/20   1/20  | 7/20
y_3:    1/20   2/20   3/20   1/20  | 7/20
y_4:    1/20   1/20   1/20   0     | 3/20
p_X(x): 3/20   6/20   8/20   3/20

52 Function of Multiple random variables Z = g(X, Y): p_Z(z) = Σ_{(x,y) : g(x,y) = z} p_{X,Y}(x, y). Expected value rules: E[g(X, Y)] = Σ_{x,y} g(x, y) p_{X,Y}(x, y), and E[g_1(X, Y) + g_2(X, Y)] = E[g_1(X, Y)] + E[g_2(X, Y)]. Extension to more than two RVs: E[Σ_{i=1}^n a_i X_i] = Σ_{i=1}^n a_i E[X_i].

53 Example: mean of the binomial Binomial RV X: the number of heads in n tosses. Mean: E[X] = Σ_{k=0}^n k (n choose k) p^k (1 − p)^(n−k) (not easy). Define the indicator RVs X_i = 1 if the ith toss is a head (prob. p), and X_i = 0 if the ith toss is a tail (prob. 1 − p). Then X = X_1 + ... + X_n, and now E[X] = Σ_{i=1}^n E[X_i] = Σ_{i=1}^n p = np.
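A simulation sketch of the indicator decomposition; each column of the random matrix below is one run of n tosses:

    n = 10;  p = 0.3;  trials = 1e5;
    X = sum(rand(n, trials) < p);    % sum of n Bernoulli(p) indicators per trial
    mean(X)                          % close to n*p = 3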

54 Summary of Joint PMF Joint PMF of X and Y: p_{X,Y}(x, y) = P({X = x} ∩ {Y = y}). Marginal PMF from the joint PMF: p_X(x) = Σ_y p_{X,Y}(x, y). Expected value rule: E[g(X, Y)] = Σ_{x,y} g(x, y) p_{X,Y}(x, y). Linearity: E[Σ_{i=1}^n a_i X_i] = Σ_{i=1}^n a_i E[X_i].

55 Conditioning Conditional PMF of X given Y = y (hence, A = {Y = y}): p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y). Conditional PMF of X given that event A occurs: p_{X|A}(x|A) = P(X = x | A). Properties follow immediately from the results for conditional probabilities: p_{X,Y}(x, y) = p_Y(y) p_{X|Y}(x|y) (Product Rule); p_X(x) = Σ_y p_Y(y) p_{X|Y}(x|y) (Total Probability for RVs; the Sum Rule).

56 Slice View of Conditional PMF (Figure: a joint PMF over x and y, sliced at y = 1, 2, 3; each renormalized slice gives a conditional PMF p_{X|Y}(x|1), p_{X|Y}(x|2), p_{X|Y}(x|3).)

57 Conditional PMF: Example Joint PMF p_{X,Y}(x, y) in tabular form; row sums: marginal PMF p_Y(y), column sums: marginal PMF p_X(x):
        x_1    x_2    x_3    x_4   | p_Y(y)
y_1:    0      1/20   1/20   1/20  | 3/20
y_2:    1/20   2/20   3/20   1/20  | 7/20
y_3:    1/20   2/20   3/20   1/20  | 7/20
y_4:    1/20   1/20   1/20   0     | 3/20
p_X(x): 3/20   6/20   8/20   3/20
What is p_{X|Y}(x|y)?

58 Conditional PMF: Example p_{X|Y}(x|y) = Pr(X = x | Y = y) = p_{X,Y}(x, y) / p_Y(y). p_{X|Y}(x|y_1) = 0/3, 1/3, 1/3, 1/3 for x = x_1, x_2, x_3, x_4. p_{X|Y}(x|y_2) = 1/7, 2/7, 3/7, 1/7 for x = x_1, x_2, x_3, x_4. p_{X|Y}(x|y_3) = 1/7, 2/7, 3/7, 1/7 for x = x_1, x_2, x_3, x_4. p_{X|Y}(x|y_4) = 1/3, 1/3, 1/3, 0/3 for x = x_1, x_2, x_3, x_4.
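A sketch that reproduces these numbers from the table: divide each row of the joint PMF by its row sum (the element-wise division relies on Matlab's implicit expansion; use bsxfun on older versions):

    pXY = [0 1 1 1; 1 2 3 1; 1 2 3 1; 1 1 1 0] / 20;   % rows y_1..y_4, columns x_1..x_4
    pY = sum(pXY, 2);                                   % marginal of Y (row sums)
    pXgivenY = pXY ./ pY                                % row i is p_{X|Y}(x | y_i)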

59 Example: passing the exam within n trials A student takes a test repeatedly until he passes, giving up after failing n times. The probability of passing each test is p, so the number of attempts is geometric: p_X(k) = p(1 − p)^(k−1). What is the PMF of the number of attempts, given that he passes? Let A = {X ≤ n}; then P(A) = Σ_{m=1}^n (1 − p)^(m−1) p. The conditional PMF is p_{X|A}(k) = (1 − p)^(k−1) p / Σ_{m=1}^n (1 − p)^(m−1) p if k = 1, ..., n, and 0 otherwise.

60 Example: passing the exam within n trials (Figure: the unconditional geometric PMF p_X(k), starting at height p, versus the conditional PMF p_{X|A}(k), starting at height p/P(A) and truncated at k = n.)

61 Conditional Expectation Conditional expectation of X given event A: E[X|A] = Σ_x x p_{X|A}(x|A). Conditional expectation of X given Y = y: E[X|Y = y] = Σ_x x p_{X|Y}(x|y).

62 Conditional Expectation - Properties Total expectation theorem for RVs: E[X] = Σ_y p_Y(y) E[X|Y = y]. The result follows from the definitions and total probability: Σ_y p_Y(y) E[X|Y = y] = Σ_y p_Y(y) Σ_x x p_{X|Y}(x|y) (3) = Σ_x x Σ_y p_Y(y) p_{X|Y}(x|y) (4) = Σ_x x p_X(x) = E[X] (5).

63 Conditional Expectation The conditional expectation of course depends on what value Y takes: for a different y, we have a different conditional expectation. That is, E[X|Y = y] is a function of y. Replacing the argument of any function with a RV gives us a function of a RV, so E[X|Y] is a function of Y. Using the ER, with g(Y) taken to be E[X|Y], we can evaluate E[E[X|Y]].

64 Independence Two random variables X and Y are called independent if p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y, i.e. if the events {X = x} and {Y = y} are independent in the original sense for all possible x and y. It follows that p_X(x) = p_{X|Y}(x|y). If X and Y are independent and g(x) and h(y) are functions, then E[g(X)h(Y)] = Σ_{x,y} g(x)h(y) p_{X,Y}(x, y) = Σ_{x,y} g(x)h(y) p_X(x) p_Y(y) = (Σ_x g(x) p_X(x))(Σ_y h(y) p_Y(y)) = E[g(X)] E[h(Y)].

65 Independence - Example Joint PMF of X and Y (rows y = 4 down to y = 1, columns x = 1, ..., 4):
y = 4: 1/20  2/20  2/20  0
y = 3: 2/20  4/20  1/20  2/20
y = 2: 0     1/20  3/20  1/20
y = 1: 0     1/20  0     0
Are X and Y independent? Joint PMF of Z and W (rows z = 4 down to z = 1, columns w = 1, ..., 4):
z = 4: 1/30  2/30  1/30  2/30
z = 3: 2/30  4/30  2/30  4/30
z = 2: 1/30  2/30  1/30  2/30
z = 1: 1/30  2/30  1/30  2/30
Are Z and W independent? (See the sketch below.)
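A sketch of the independence check for Z and W: compare the joint table with the outer product of its marginals. (X and Y fail this test immediately: their table has zero entries at positions where both marginals are positive.)

    pZW = [1 2 1 2; 2 4 2 4; 1 2 1 2; 1 2 1 2] / 30;   % rows z = 4..1, columns w = 1..4
    pZ = sum(pZW, 2);  pW = sum(pZW, 1);
    max(max(abs(pZW - pZ * pW)))                        % 0, so Z and W are independent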

66 Sums of Independent RVs If X_1, X_2, ..., X_n are independent, then E[X_1 + ... + X_n] = E[X_1] + ... + E[X_n] and var[X_1 + ... + X_n] = var[X_1] + ... + var[X_n]. Previously, we also had E[aX + b] = aE[X] + b and var[aX + b] = a² var[X]. The mean result is true regardless of independence; the variance result makes use of the independence to get rid of the cross terms. Let's prove the variance result for the n = 2 case, as sketched below.
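A sketch of that proof: write μ_i = E[X_i]. Then
var[X_1 + X_2] = E[(X_1 + X_2 − μ_1 − μ_2)²]
             = E[(X_1 − μ_1)²] + E[(X_2 − μ_2)²] + 2 E[(X_1 − μ_1)(X_2 − μ_2)]
             = var[X_1] + var[X_2] + 2 E[X_1 − μ_1] E[X_2 − μ_2]
             = var[X_1] + var[X_2],
where the cross term factors into a product of expectations because X_1 and X_2 are independent, and each factor E[X_i − μ_i] is 0.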

67 Sums of Independent RVs Example: mean and variance of the Binomial, where X = X_1 + ... + X_n and X_i = 1 if the ith toss is a head (prob. p), 0 if the ith toss is a tail (prob. 1 − p). E[X] = Σ_{i=1}^n E[X_i] = Σ_{i=1}^n p = np. var[X] = Σ_{i=1}^n var[X_i] = np(1 − p).

68 The Sample Mean An important special case of sums of random variables occurs when a_i = 1/n for all i and the random variables X_i have a common mean μ and variance σ_X². The sample mean is defined by S_n = (1/n) Σ_{i=1}^n X_i. This is also called the sample average. From the previous formulas, E[S_n] = Σ_{i=1}^n (1/n) E[X_i] = μ, and var[S_n] = (1/n²) Σ_{i=1}^n var[X_i] = σ_X²/n (recall that var[aX] = a² var[X]).
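A simulation sketch of these formulas, taking X_i uniform on [0, 1], so μ = 1/2 and σ_X² = 1/12:

    n = 100;  trials = 1e4;
    Sn = mean(rand(n, trials));      % one sample mean per column
    mean(Sn)                         % close to 1/2
    var(Sn)                          % close to (1/12)/n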

69 The Sample Mean This is actually a very important result, since it tells us that the variance of S_n shrinks to zero as the number of samples goes to infinity. This will be a key fact in the proof of the law of large numbers.

70 The Sample Mean - Example We want to reduce the error of a measurement of some quantity m by performing the measurement independently multiple times and then averaging. We know that for each measurement the error standard deviation is δ. How many measurements do we need to reduce the error (standard deviation) to 0.1δ? Model: let X_i denote the result of the ith measurement. We model X_i as a random variable due to measurement error, with mean m and variance δ². The question is: for what n is the standard deviation of S_n = (1/n) Σ_{i=1}^n X_i equal to 0.1δ?
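Working this out with the formula from the previous slide: the standard deviation of S_n is δ/√n, so δ/√n = 0.1δ gives √n = 10, i.e. n = 100 measurements.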

71 Review of the topics in Chapter 2 Concept of random variables. Probability mass function: definition, calculation, properties. Useful PMFs: Binomial, Geometric, Poisson. Expectation: mean and variance, expected value rule, linearity rule.

72 Review of the topics in Chapter 2 Multiple RVs: joint and marginal PMFs, conditional PMF, conditional expectation, independence, expectation of a product of independent RVs, expectation and variance of a sum of independent RVs.
