Random Variables and Their Expected Values


 Carol Bryant
Discrete and Continuous Random Variables
The Probability Mass Function
The (Cumulative) Distribution Function
Definition. Let S be the sample space of some probabilistic experiment. A function X : S → R is called a random variable.

Example.
1. A unit is selected at random from a population of units, so S is the collection of units in the population, and a characteristic (weight, volume, or opinion on a certain matter) is recorded. The numerical description of the outcome is a random variable.
2. S = {s = (x_1, ..., x_n) : x_i ∈ R for all i}, with X(s) = Σ_i x_i, or X(s) = x̄, or X(s) = max{x_1, ..., x_n}.
3. S = {s : 0 ≤ s < ∞} (e.g. we may be recording the lifetime of an electrical component), with X(s) = I(s > 1500), or X(s) = s, or X(s) = log(s).
A random variable X induces a probability measure on the range of its values, which is denoted by X(S). X(S) can be thought of as the sample space of a compound experiment consisting of the original experiment followed by the transformation of the outcome into a numerical value.

Because the value X(s) of the random variable X is determined from the outcome s, we may assign probabilities to the possible values of X. For example, if a die is rolled and we define X(s) = 1 for s = 1, 2, 3, 4, and X(s) = 0 for s = 5, 6, then P(X = 1) = 4/6 and P(X = 0) = 2/6.

The probability measure P_X, induced on X(S) by the random variable X, is called the (probability) distribution of X.
The distribution of a random variable is considered known if the probabilities P_X((a, b]) = P(a < X ≤ b) are known for all a < b.

Definition. A random variable X is called discrete if X(S) is a finite or countably infinite set. If X(S) is uncountably infinite, then X is called continuous.

For a discrete r.v. X, P_X is completely specified by the probabilities P_X({k}) = P(X = k), for each k ∈ X(S). (See part 4 of the Theorem on p. 15 of the 2nd lecture slides.) The function p(x) = P(X = x) is called the probability mass function (pmf) of X.
Example. Consider a batch of size N = 10 products, 3 of which are defective. Draw 3 at random and without replacement and let the r.v. X denote the number of defective items. Find the pmf of X.

Solution. The sample space of X is S_X = {0, 1, 2, 3}, and (writing C(n, k) for the binomial coefficient "n choose k"):

P(X = 0) = C(7,3)C(3,0)/C(10,3) = 35/120,  P(X = 1) = C(7,2)C(3,1)/C(10,3) = 63/120,
P(X = 2) = C(7,1)C(3,2)/C(10,3) = 21/120,  P(X = 3) = C(7,0)C(3,3)/C(10,3) = 1/120.

Thus, the pmf of X is

x      0       1       2       3
p(x)   35/120  63/120  21/120  1/120
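As a quick numerical check, the pmf above can be computed with `math.comb`; this is a minimal sketch (variable and function names are ours, not from the slides):

```python
from math import comb

# pmf of the number of defectives when drawing n = 3 without replacement
# from N = 10 items of which M = 3 are defective (the example above).
N, M, n = 10, 3, 3

def pmf(x):
    # P(X = x) = C(M, x) * C(N - M, n - x) / C(N, n)
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

probs = {x: pmf(x) for x in range(4)}
# probs matches the table: 35/120, 63/120, 21/120, 1/120
```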
Figure: Bar graph of the probability mass function (x values on the horizontal axis).
Example. Let S = {s = (x_1, ..., x_n) : x_i = 0 or 1 for all i} (so S is the sample space of n flips of a coin), and let p be the probability of 1 (i.e. the probability of heads). Then the probability measure on S is

P({(x_1, ..., x_n)}) = p^(Σ_i x_i) (1 − p)^(n − Σ_i x_i).

Let X(s) = Σ_i x_i, so X(S) = {0, 1, ..., n}. Then the distribution P_X of X, which is a probability measure on X(S), is called the Binomial distribution.

Suppose n = 3 and p = 1/2. Then P_X(0) = P(X = 0) = 1/8, P_X(1) = P(X = 1) = 3/8 (why?), P_X(2) = P(X = 2) = 3/8, P_X(3) = P(X = 3) = 1/8.
Example. Three balls are selected at random and without replacement from an urn containing 20 balls numbered 1 through 20. Find the probability that at least one of the balls has a number as large as 17.

Solution. Here S = {s = (i_1, i_2, i_3) : 1 ≤ i_1, i_2, i_3 ≤ 20}, X(s) = max{i_1, i_2, i_3}, X(S) = {3, 4, ..., 20}, and we want to find

P(X ≥ 17) = P(X = 17) + P(X = 18) + P(X = 19) + P(X = 20).

These are found from the formula

P(X = k) = C(k − 1, 2)/C(20, 3)  (why?)

The end result is P(X ≥ 17) = 580/1140 ≈ 0.509.
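The sum above is small enough to verify directly; a sketch using `math.comb` (names are ours):

```python
from math import comb

# P(X = k) = C(k-1, 2) / C(20, 3) for the maximum of 3 balls drawn
# without replacement from 1..20; we want P(X >= 17).
total = comb(20, 3)
p_at_least_17 = sum(comb(k - 1, 2) for k in range(17, 21)) / total
```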
The PMF of a Function of X. Let X be a discrete random variable with range (i.e. set of possible values) R_X and distribution P_X, and let Y = g(X) be a function of X with range R_Y. Then the pmf p_Y(y) of Y is given in terms of the pmf p_X(x) of X by

p_Y(y) = Σ_{x ∈ R_X : g(x) = y} p_X(x), for all y ∈ R_Y.

Example. Roll a die and let X denote the outcome. If X = 1 or 2, you win $1; if X = 3, you win $2; and if X ≥ 4, you win $4. Let Y denote your prize. Find the pmf of Y.

Solution. The pmf of Y is

y       1    2    4
p_Y(y)  2/6  1/6  3/6
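The formula p_Y(y) = Σ_{x: g(x)=y} p_X(x) is just a regrouping of the pmf of X; a sketch of the die/prize example (exact arithmetic via `Fraction`; names are ours):

```python
from fractions import Fraction

# pmf of X: a fair die.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}

def g(x):
    # prize rule from the example: $1 for X in {1,2}, $2 for X = 3, $4 for X >= 4
    return 1 if x <= 2 else (2 if x == 3 else 4)

# p_Y(y) = sum of p_X(x) over all x with g(x) = y
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), Fraction(0)) + p
```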
Definition. The function F_X : R → [0, 1] (or simply F if no confusion is possible) defined by F_X(x) = P(X ≤ x) = P_X((−∞, x]) is called the (cumulative) distribution function of the r.v. X.

Proposition. F_X determines the probability distribution P_X of X.

Proof: P_X is determined by its values P_X((a, b]) for all intervals (a, b]. However, P_X((a, b]) is determined from F_X by P_X((a, b]) = F_X(b) − F_X(a).
Example. Consider a batch of size N = 10 products, 3 of which are defective. Draw 3 at random and without replacement and let the r.v. X denote the number of defectives. Find the cdf of X.

Solution.

x      0       1       2        3
p(x)   35/120  63/120  21/120   1/120
F(x)   35/120  98/120  119/120  1

Moreover, F(−1) = 0, and F(1.5) = F(1) = 98/120. Also, p(1) = F(1) − F(0).
Example. Consider a random variable X with cumulative distribution function given by

F(x) = 0 for x < 1,
F(x) = 0.4 for 1 ≤ x < 2,
F(x) = 0.7 for 2 ≤ x < 3,
F(x) = 0.9 for 3 ≤ x < 4,
F(x) = 1 for x ≥ 4.
Figure: The CDF of a discrete distribution is a step or jump function.
Example. Let X have the cdf shown above. Use the form of the cdf to deduce the distribution of X.

Solution. Since its cdf is a jump function, we conclude that X is discrete with sample space the jump points of its cdf, i.e. 1, 2, 3, and 4. Finally, the probability with which X takes each value equals the size of the jump at that value (for example, P(X = 1) = 0.4). These deductions are justified as follows: a) P(X < 1) = 0 means that X cannot take a value less than one. b) F(1) = 0.4 implies that P(X = 1) = 0.4. c) The second of the equations defining F also implies that P(1 < X < 2) = 0, and so on.
Proposition (Properties of F_X). The distribution function F_X of a r.v. X satisfies:
1. lim_{x → −∞} F_X(x) = 0, and lim_{x → ∞} F_X(x) = 1.
2. F_X(x) is an increasing (i.e. nondecreasing) function.
3. F_X(x) is right continuous.
Proof: See Section 4.10.
A unit is selected at random from a population of N units, and let the random variable X be the (numerical) value of a characteristic of interest. Let v_1, v_2, ..., v_N be the values of the characteristic of interest for each of the N units. Then the expected value of X, denoted by µ_X or E(X), is defined by

E(X) = (1/N) Σ_{i=1}^{N} v_i.

Example.
1. Let X denote the outcome of a roll of a die. Find E(X).
2. Let X denote the outcome of a roll of a die that has the six on four sides and the number 8 on the other two sides. Find E(X).
The expected value, E(X) or µ_X, of a discrete r.v. X having a possibly infinite sample space S_X and pmf p(x) = P(X = x), for x ∈ S_X, is defined as

µ_X = Σ_{x ∈ S_X} x p(x).

Example. Roll a die and let X denote the outcome. If X = 1 or 2, you win $1; if X = 3, you win $2; and if X ≥ 4, you win $4. Let Y denote your prize. Find E(Y).

Solution. The pmf of Y is

y       1    2    4
p_Y(y)  2/6  1/6  3/6

Thus, E(Y) = 1·(2/6) + 2·(1/6) + 4·(3/6) = 16/6 ≈ 2.667.
Example. You are given a choice between accepting $15.17 (= 91/6) or rolling a die and winning $X². What will you choose and why?

Solution. If the game will be played several times, your decision should be based on the value of E(X²). (Why?) To find this, use 1² + 2² + 3² + 4² + 5² + 6² = 91, so E(X²) = 91/6.

Proposition. Let X be a discrete r.v. taking values x_i, i ≥ 1, having pmf p_X. Then

E[g(X)] = Σ_i g(x_i) p_X(x_i).
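The proposition (often called the law of the unconscious statistician) can be checked on this example; a sketch in exact arithmetic:

```python
from fractions import Fraction

# E[g(X)] = sum of g(x) p(x); here g(x) = x^2 for a fair die.
p = Fraction(1, 6)
e_x2 = sum(x * x * p for x in range(1, 7))  # should equal 91/6
```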
Example. A product that is stocked monthly yields a net profit of b dollars for each unit sold and a net loss of l dollars for each unit left unsold at the end of the month. The monthly demand (i.e. # of units ordered) for this product at a particular store location during a given month is a r.v. having pmf p(k), k ≥ 0. If the store stocks s units, find the expected profit.

Solution. Let X be the monthly demand. The random variable of interest here is the profit

Y = g(X) = bX − (s − X)l, if X ≤ s;  Y = bs, if X > s.

We want the expected profit, i.e.

E(Y) = Σ_{k=0}^{s} [bk − (s − k)l] p(k) + bs Σ_{k=s+1}^{∞} p(k).
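The expected profit can be sketched as a short function. The demand pmf and the values of b, l, s below are made-up illustrations, not from the slides:

```python
# E(Y) = sum_{k <= s} [b*k - (s-k)*l] p(k) + b*s * P(X > s)
def expected_profit(b, l, s, pmf):
    ev = 0.0
    for k, pk in pmf.items():
        # profit is b*k - (s-k)*l when demand k <= stock s, else b*s
        ev += (b * k - (s - k) * l) * pk if k <= s else b * s * pk
    return ev

demand = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.25, 4: 0.15}  # hypothetical pmf
profit = expected_profit(b=5.0, l=2.0, s=3, pmf=demand)
```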
Corollary. For constants a, b we have E(aX + b) = aE(X) + b.

Definition. The variance, σ²_X or Var(X), and standard deviation, σ_X or SD(X), of a r.v. X are

σ²_X = Var(X) = E(X − µ_X)²,  σ_X = √(σ²_X).

Proposition. Two common properties of the variance are
1. Var(X) = E(X²) − µ²_X
2. Var(aX + b) = a² Var(X)
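Both variance properties can be verified numerically on a small pmf; a sketch using a fair die (names are ours):

```python
# Check Var(X) = E(X^2) - mu^2 and Var(aX + b) = a^2 Var(X) on a fair die.
pmf = {x: 1/6 for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())            # definition
var_shortcut = sum(x * x * p for x, p in pmf.items()) - mu ** 2  # property 1

a, b = 3.0, 7.0
# variance of aX + b, computed from the definition
var_affine = sum((a * x + b - (a * mu + b)) ** 2 * p for x, p in pmf.items())
```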
The Bernoulli Random Variable

A r.v. X is called Bernoulli if it takes only two values. The two values are referred to as success (S) and failure (F), or are recoded as 1 and 0. Thus, always, S_X = {0, 1}. Experiments resulting in a Bernoulli r.v. are called Bernoulli experiments.

Example.
1. A product is inspected. Set X = 1 if defective, X = 0 if nondefective.
2. A product is put on life test. Set X = 1 if it lasts more than 1000 hours, X = 0 otherwise.

If P(X = 1) = p, we write X ~ Bernoulli(p) to indicate that X is Bernoulli with probability of success p.
If X ~ Bernoulli(p), then
Its pmf is:
x     0      1
p(x)  1 − p  p
Its expected value is E(X) = p.
Its variance is σ²_X = p(1 − p). (Why?)
The Binomial Random Variable

An experiment consisting of n independent replications of a Bernoulli experiment is called a Binomial experiment. If X_1, X_2, ..., X_n are the Bernoulli r.v.'s for the n Bernoulli experiments,

Y = Σ_{i=1}^{n} X_i = the total number of 1s

is the Binomial r.v. Clearly S_Y = {0, 1, ..., n}. We write Y ~ Bin(n, p) to indicate that Y is binomial with probability of success equal to p for each Bernoulli trial.
If Y ~ Bin(n, p), then
Its pmf is:

P(Y = k) = C(n, k) p^k (1 − p)^(n−k),  k = 0, 1, ..., n.

Use x C(n, x) = n C(n−1, x−1) and x² C(n, x) = n x C(n−1, x−1) to get:
1. Its expected value is E(Y) = np.
2. Its variance is σ²_Y = np(1 − p).
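The mean and variance formulas can be confirmed by summing the pmf directly; a sketch with arbitrary illustrative parameters:

```python
from math import comb

# Binomial pmf, plus a direct check that E(Y) = np and Var(Y) = np(1-p).
n, p = 8, 0.3  # illustrative values

def binom_pmf(k):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

mean = sum(k * binom_pmf(k) for k in range(n + 1))
var = sum(k * k * binom_pmf(k) for k in range(n + 1)) - mean ** 2
```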
Example. A company sells screws in packages of 10 and offers a money-back guarantee if two or more of the screws are defective. If a screw is defective with probability 0.01, independently of the other screws, what proportion of the packages sold will the company replace?

Solution. With X ~ Bin(10, 0.01), the answer is 1 − P(X = 0) − P(X = 1) ≈ 0.004.
Example. Physical traits, such as eye color, are determined by a pair of genes, each of which can be either dominant (d) or recessive (r); one is inherited from the mother and one from the father. Persons with gene pairs (dd), (dr) and (rd) are alike in that physical trait. Assume that a child is equally likely to inherit either of the two genes from each parent. If both parents are hybrid with respect to a particular trait (i.e. both have gene pairs (dr) or (rd)), find the probability that three of their four children will be hybrid with respect to this trait.

Solution. The probability that an offspring of two hybrid parents is also hybrid is 1/2. Thus, the desired probability is

C(4, 3) (1/2)³ (1/2) = 4/16 = 0.25.
Example. In order for the defendant to be convicted in a jury trial, at least eight of the twelve jurors must enter a guilty vote. Assume each juror makes the correct decision with probability 0.7, independently of the other jurors. If 40% of defendants in such jury trials are innocent, what is the probability that the jury renders the correct verdict for a randomly selected defendant?

Solution. Let B = {jury renders the correct verdict} and A = {defendant is innocent}. Then, according to the Law of Total Probability,

P(B) = P(B|A)P(A) + P(B|A^c)P(A^c) = P(B|A)·0.4 + P(B|A^c)·0.6.
Solution continued. Next, let X denote the number of jurors who reach the correct verdict in a particular trial. Thus X ~ Bin(12, 0.7), and

P(B|A) = P(X ≥ 5) = 1 − Σ_{k=0}^{4} C(12, k) 0.7^k 0.3^(12−k) ≈ 0.9905,
P(B|A^c) = P(X ≥ 8) = Σ_{k=8}^{12} C(12, k) 0.7^k 0.3^(12−k) ≈ 0.7237.

Thus, P(B) = P(B|A)·0.4 + P(B|A^c)·0.6 ≈ 0.8304.
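The two binomial tail probabilities and the total-probability combination can be reproduced directly; a sketch (function names are ours):

```python
from math import comb

def binom_tail(n, p, k_lo):
    # P(X >= k_lo) for X ~ Bin(n, p)
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(k_lo, n + 1))

# Innocent: acquittal needs fewer than 8 guilty votes, i.e. >= 5 correct votes.
p_correct_innocent = binom_tail(12, 0.7, 5)
# Guilty: conviction needs >= 8 correct (guilty) votes.
p_correct_guilty = binom_tail(12, 0.7, 8)

p_correct = 0.4 * p_correct_innocent + 0.6 * p_correct_guilty
```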
Example. A communications system consisting of n components works if at least half of its components work. Suppose it is possible to add components to the system, and that currently the system has n = 2k − 1 components.
1. Show that by adding one component the system becomes more reliable for all integers k ≥ 1.
2. Show that this is not necessarily the case if we add two components to the system.

Solution. 1. Let A_n = {the system works when it has n components}. Then

A_{2k−1} = {k or more of the 2k − 1 work},
A_{2k} = A_{2k−1} ∪ {exactly k − 1 of the original 2k − 1 work, and the 2k-th works}.
Solution continued. It follows that A_{2k−1} ⊆ A_{2k}. Thus, P(A_{2k−1}) ≤ P(A_{2k}).

2. Using the same notation,

A_{2k+1} = {k + 1 or more of the original 2k − 1 work}
  ∪ {exactly k of the original 2k − 1 work, and at least one of the 2k-th and (2k+1)-th work}
  ∪ {exactly k − 1 of the original 2k − 1 work, and both the 2k-th and (2k+1)-th work}.

It is seen that A_{2k−1} is not a subset of A_{2k+1}, since, for example, A_{2k−1} includes the outcome {exactly k of the original 2k − 1 work, and both added components fail} but A_{2k+1} does not. It is also clear that A_{2k+1} is not a subset of A_{2k−1}. Thus, more information is needed to compare the reliability of the two systems.
Example (continued). Suppose each component functions with probability p independently of the others. For what value of p is a (2k+1)-component system more reliable than a (2k−1)-component system?

Solution. Let X denote the number of the first 2k − 1 components that function. Then

P(A_{2k−1}) = P(X ≥ k) = P(X = k) + P(X ≥ k + 1),
P(A_{2k+1}) = P(X ≥ k + 1) + P(X = k)(1 − (1 − p)²) + P(X = k − 1)p²,

so P(A_{2k+1}) − P(A_{2k−1}) = P(X = k − 1)p² − P(X = k)(1 − p)², and P(A_{2k+1}) − P(A_{2k−1}) > 0 iff p > 0.5.
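The conclusion can be spot-checked numerically. The sketch below (our names; "at least half" of an odd number n of components is read as at least (n+1)/2) compares the two systems at p above and below 0.5:

```python
from math import comb

def reliability(n, p):
    # P(at least half of n components work); for odd n this is >= (n+1)//2.
    need = (n + 1) // 2
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j) for j in range(need, n + 1))

k = 3
better_at_07 = reliability(2 * k + 1, 0.7) > reliability(2 * k - 1, 0.7)
better_at_03 = reliability(2 * k + 1, 0.3) > reliability(2 * k - 1, 0.3)
```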
The Hypergeometric Random Variable

The hypergeometric distribution arises when a simple random sample of size n is taken from a finite population of N units of which M are labeled 1 and the rest are labeled 0. The number X of units labeled 1 in the sample is a hypergeometric random variable with parameters n, M and N. This is denoted by X ~ Hypergeo(n, M, N).

If X ~ Hypergeo(n, M, N), its pmf is

P(X = x) = C(M, x) C(N − M, n − x) / C(N, n).

Note that P(X = x) = 0 if x > M, or if n − x > N − M.
Applications of the Hypergeometric Distribution

Read Example 8i, p. 161. (LTP with hypergeometric.) Quality control is the primary use of the hypergeometric distribution. The following is an example of a different use.

Example (The Capture/Recapture Method). This method is used to estimate the size N of a wildlife population. Suppose that 10 animals are captured, tagged and released. On a later occasion, 20 animals are captured. Let X be the number of recaptured (tagged) animals. If all C(N, 20) possible groups are equally likely, X is more likely to take small values if N is large. The precise form of the hypergeometric pmf can be used to estimate N from the value that X takes.
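One concrete way to "estimate N from the value that X takes" is to pick the N that maximizes the hypergeometric likelihood. The sketch below assumes an observed value x = 3 and a search range, both illustrative and not from the slides:

```python
from math import comb

def likelihood(N, tagged=10, sample=20, x=3):
    # Hypergeometric probability of seeing x tagged animals among `sample`
    # recaptures, as a function of the unknown population size N.
    if N < tagged + sample - x:
        return 0.0
    return comb(tagged, x) * comb(N - tagged, sample - x) / comb(N, sample)

n_hat = max(range(30, 301), key=likelihood)  # search a plausible range for N
# For x = 3 the maximizer is floor(10 * 20 / 3) = 66.
```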
If X ~ Hypergeo(n, M, N), then
Its expected value is: µ_X = n M/N.
Its variance is: σ²_X = n (M/N)(1 − M/N)(N − n)/(N − 1).
(N − n)/(N − 1) is called the finite population correction factor.

Binomial Approximation to Hypergeometric Probabilities. If n ≤ 0.05 N, then P(X = x) ≈ P(Y = x), where Y ~ Bin(n, p = M/N).
Example. n = 10 electronic toys are selected at random from a batch delivery. We want the probability that 2 of the 10 will be defective.
1. If the batch has N = 20 toys with 5 of them defective,
P(X = 2) = C(5, 2) C(15, 8) / C(20, 10) = 0.348.
Application of the binomial (n = 10, p = 0.25) approximation gives P(Y = 2) = 0.282.
2. If N = 100 and M = 25 (so p remains 0.25), we have
P(X = 2) = C(25, 2) C(75, 8) / C(100, 10) = 0.292.
It is seen that the binomial probability of 0.282 provides a better approximation when N is large.
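The three probabilities in this example can be reproduced with `math.comb`; a sketch (function names are ours):

```python
from math import comb

def hyper_pmf(x, n, M, N):
    # P(X = x) for X ~ Hypergeo(n, M, N)
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

h_small = hyper_pmf(2, n=10, M=5, N=20)    # small population
h_large = hyper_pmf(2, n=10, M=25, N=100)  # larger population, same p = 0.25
b_approx = binom_pmf(2, 10, 0.25)          # binomial approximation
```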
Binomial Approximation to Hypergeometric Probabilities

As N → ∞, M → ∞ with M/N → p, and n remains fixed,

C(M, x) C(N − M, n − x) / C(N, n) → C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, ..., n.

One way to show this is via Stirling's formula for approximating factorials:

n! ≈ √(2πn) (n/e)^n, or, more precisely, n! = √(2πn) (n/e)^n e^(λ_n), where 1/(12n + 1) < λ_n < 1/(12n).

Use this on the left-hand side and note that the terms resulting from √(2πn) tend to 1, and the powers of e cancel. Thus,
C(M, x) C(N − M, n − x) / C(N, n) ≈ C(n, x) · [M^M (N − M)^(N−M) (N − n)^(N−n)] / [N^N (M − x)^(M−x) (N − M − n + x)^(N−M−n+x)],

where

M^M / (M − x)^(M−x) = (1 + x/(M − x))^(M−x) M^x,
(N − n)^(N−n) / N^N = (1 − n/N)^N (N − n)^(−n),
(N − M)^(N−M) / (N − M − n + x)^(N−M−n+x) = (1 + (n − x)/(N − M − n + x))^(N−M−n+x) (N − M)^(n−x).
The Negative Binomial Random Variable

In the negative binomial experiment, a Bernoulli experiment is repeated independently until the r-th 1 is observed. For example, products are inspected, as they come off the assembly line, until the r-th defective is found. The number, Y, of Bernoulli experiments until the r-th 1 is observed is the negative binomial r.v. If p is the probability of 1 in a Bernoulli trial, we write Y ~ NBin(r, p). If r = 1, Y is called the geometric r.v.
If Y ~ NBin(r, p), then
Its pmf is: P(Y = y) = C(y − 1, r − 1) p^r (1 − p)^(y−r),  y = r, r + 1, ...
Its expected value is: E(Y) = r/p.
Its variance is: σ²_Y = r(1 − p)/p².
If r = 1, the Negative Binomial is called Geometric:

P(X = x | p) = p(1 − p)^(x−1),  x ≥ 1.

The memoryless property: for integers s > t,

P(X > s | X > t) = P(X > s − t).

Example. Independent Bernoulli trials are performed with probability of success p. Find the probability that r successes will occur before m failures.

Solution. r successes will occur before m failures iff the r-th success occurs no later than the (r + m − 1)-th trial. Hence the desired probability is found from

Σ_{k=r}^{r+m−1} C(k − 1, r − 1) p^r (1 − p)^(k−r).
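The memoryless property follows from P(X > k) = (1 − p)^k for the geometric r.v.; a quick numerical check (p, s, t are illustrative):

```python
# Memoryless property: P(X > s | X > t) = P(X > s - t) for a geometric r.v.
p = 0.3  # illustrative success probability

def tail(k):
    # P(X > k) = (1 - p)^k for the geometric distribution on {1, 2, ...}
    return (1 - p) ** k

s, t = 9, 4
lhs = tail(s) / tail(t)  # P(X > s | X > t) = P(X > s) / P(X > t)
rhs = tail(s - t)        # P(X > s - t)
```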
Example. A candle is lit every evening at dinner time with a match taken from one of two match boxes. Assume each box is equally likely to be chosen and that initially both contained N matches. What is the probability that there are exactly k matches left, k = 0, 1, ..., N, when one of the match boxes is first discovered empty?

Solution. Let E be the event that box #1 is discovered empty and there are k matches in box #2. E will occur iff the (N + 1)-th choice of box #1 is made at the (N + 1 + N − k)-th trial. Thus,

P(E) = C(2N − k, N) 0.5^(2N−k+1),  k = 0, 1, ..., N,

and the desired probability is 2P(E).
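A sanity check on the formula: summing 2P(E) over k = 0, ..., N should give 1, since some box is always eventually found empty with some number of matches remaining in the other. A sketch:

```python
from math import comb

# Matchbox problem: P(exactly k matches remain) = 2 * C(2N - k, N) * 0.5**(2N - k + 1).
N = 10  # illustrative box size

def p_k_left(k):
    return 2 * comb(2 * N - k, N) * 0.5 ** (2 * N - k + 1)

total = sum(p_k_left(k) for k in range(N + 1))  # should be 1
```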
Example. Three electrical engineers toss coins to see who pays for coffee. If all three match, they toss another round; otherwise the odd person pays for coffee.
1. Find the probability of a round of tossing resulting in a match. Answer: 1/8 + 1/8 = 1/4.
2. Let Y be the number of times they toss coins until the odd person is determined. What is the distribution of Y? Answer: Geometric with p = 3/4.
3. Find P(Y ≥ 3). Answer: P(Y ≥ 3) = 1 − P(Y = 1) − P(Y = 2) = 1 − 3/4 − (1/4)(3/4) = 1/16.
Poisson

X ~ Poisson(λ) if

P(X = x | λ) = e^(−λ) λ^x / x! · I(x ∈ {0, 1, ...}).

Use e^t = Σ_{i=0}^{∞} t^i / i! to get EX = λ, σ²_X = λ.

A recursion relation: P(X = x) = (λ/x) P(X = x − 1), x ≥ 1.

This recursion relation, and that for the Binomial, can also be used to establish that if Y ~ Bin(n, p) then, as np → λ, P(Y = k) → P(X = k). There is a corresponding Poisson approximation to hypergeometric probabilities.
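The recursion relation gives a convenient way to build the whole pmf from the single value P(X = 0) = e^(−λ); a sketch:

```python
from math import exp

# Poisson pmf via the recursion P(X = x) = (lam / x) * P(X = x - 1).
lam = 0.6
probs = [exp(-lam)]  # P(X = 0) = e^{-lam}
for x in range(1, 20):
    probs.append(lam / x * probs[-1])
# The truncated sum should be very close to 1 for lam = 0.6.
```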
Proposition. If Y ~ Bin(n, p), with n ≥ 100, p ≤ 0.01, and np ≤ 20, then

P(Y ≤ k) ≈ P(X ≤ k),  k = 0, 1, 2, ..., n,

where X ~ Poisson(λ = np).

Example. Due to a serious defect, n = 10,000 cars are recalled. The probability that a car is defective is p = 0.0005. If Y is the number of defective cars, find: (a) P(Y ≥ 10), and (b) P(Y = 0).

Solution. Let X ~ Poisson(λ = np = 5). Then,
(a) P(Y ≥ 10) ≈ P(X ≥ 10) = 1 − P(X ≤ 9) = 1 − 0.968 = 0.032.
(b) P(Y = 0) ≈ P(X = 0) = e^(−5) = 0.0067.
The Poisson process

When we record the number of occurrences as they accumulate with time, we denote by X(t) the number of occurrences in the time interval [0, t].

Definition. The number of occurrences as a function of time, X(t), t ≥ 0, is called a Poisson process if the following assumptions are satisfied:
1. The probability of exactly one occurrence in a short time period of length Δt is approximately α·Δt.
2. The probability of more than one occurrence in a short time period of length Δt is approximately 0.
3. The number of occurrences after time t is independent of the number prior to this time.
The parameter α in the first assumption specifies the rate of the occurrences, i.e. the average number of occurrences per time unit. Because in an interval of length t_0 time units we would expect, on average, λ = α t_0 occurrences, we have the following

Proposition. Let X(t), t ≥ 0, be a Poisson process.
1. For each fixed t_0, X(t_0) ~ Poisson(λ = α t_0). Thus,
P(X(t_0) = k) = e^(−α t_0) (α t_0)^k / k!,  k = 0, 1, 2, ...
2. If t_1 < t_2 are two positive numbers, then X(t_2) − X(t_1) ~ Poisson(α (t_2 − t_1)).
Example. Continuous electrolytic inspection of a tin plate yields on average 0.2 imperfections per minute. Find:
1. The probability of one imperfection in three minutes.
2. The probability of at most one imperfection in 0.25 hours.

Solution. 1) Here α = 0.2, t = 3, λ = αt = 0.6. Thus,
P(X(3) = 1) = F(1; λ = 0.6) − F(0; λ = 0.6) = 0.878 − 0.549 = 0.329.
2) Here α = 0.2, t = 15, λ = αt = 3.0. Thus,
P(X(15) ≤ 1) = F(1; λ = 3.0) = 0.199.
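Both answers follow from the Poisson pmf with λ = αt; a sketch (names are ours):

```python
from math import exp

def pois_pmf(k, lam):
    # e^{-lam} * lam^k / k!, built multiplicatively to avoid factorials
    out = exp(-lam)
    for i in range(1, k + 1):
        out *= lam / i
    return out

alpha = 0.2  # imperfections per minute
p_one_in_3min = pois_pmf(1, alpha * 3)                                  # part 1
p_at_most_one_in_15min = sum(pois_pmf(k, alpha * 15) for k in range(2))  # part 2
```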
IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem Time on my hands: Coin tosses. Problem Formulation: Suppose that I have
More informationJan 31 Homework Solutions Math 151, Winter 2012. Chapter 3 Problems (pages 102110)
Jan 31 Homework Solutions Math 151, Winter 01 Chapter 3 Problems (pages 10110) Problem 61 Genes relating to albinism are denoted by A and a. Only those people who receive the a gene from both parents
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 101634501 Probability and Statistics for Engineers Winter 20102011 Contents 1 Joint Probability Distributions 1 1.1 Two Discrete
More informationProbability Generating Functions
page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence
More informationDefinition: Suppose that two random variables, either continuous or discrete, X and Y have joint density
HW MATH 461/561 Lecture Notes 15 1 Definition: Suppose that two random variables, either continuous or discrete, X and Y have joint density and marginal densities f(x, y), (x, y) Λ X,Y f X (x), x Λ X,
More informationNormal distribution Approximating binomial distribution by normal 2.10 Central Limit Theorem
1.1.2 Normal distribution 1.1.3 Approimating binomial distribution by normal 2.1 Central Limit Theorem Prof. Tesler Math 283 October 22, 214 Prof. Tesler 1.1.23, 2.1 Normal distribution Math 283 / October
More informationOverview of Monte Carlo Simulation, Probability Review and Introduction to Matlab
Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?
More informationThe sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].
Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real
More informationIntroduction to Probability
Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence
More informationSTAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE
STAT 315: HOW TO CHOOSE A DISTRIBUTION FOR A RANDOM VARIABLE TROY BUTLER 1. Random variables and distributions We are often presented with descriptions of problems involving some level of uncertainty about
More informationStatistics 100A Homework 4 Solutions
Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation
More informationYou flip a fair coin four times, what is the probability that you obtain three heads.
Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.
More informationBivariate Distributions
Chapter 4 Bivariate Distributions 4.1 Distributions of Two Random Variables In many practical cases it is desirable to take more than one measurement of a random observation: (brief examples) 1. What is
More informationDefinition and Calculus of Probability
In experiments with multivariate outcome variable, knowledge of the value of one variable may help predict another. For now, the word prediction will mean update the probabilities of events regarding the
More informationThe Binomial Distribution
The Binomial Distribution James H. Steiger November 10, 00 1 Topics for this Module 1. The Binomial Process. The Binomial Random Variable. The Binomial Distribution (a) Computing the Binomial pdf (b) Computing
More informationECE302 Spring 2006 HW4 Solutions February 6, 2006 1
ECE302 Spring 2006 HW4 Solutions February 6, 2006 1 Solutions to HW4 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in
More informationHomework 6 (due November 4, 2009)
Homework 6 (due November 4, 2009 Problem 1. On average, how many independent games of poker are required until a preassigned player is dealt a straight? Here we define a straight to be cards of consecutive
More informationSection 5.1 Continuous Random Variables: Introduction
Section 5. Continuous Random Variables: Introduction Not all random variables are discrete. For example:. Waiting times for anything (train, arrival of customer, production of mrna molecule from gene,
More informationSTAT 3502. x 0 < x < 1
Solution  Assignment # STAT 350 Total mark=100 1. A large industrial firm purchases several new word processors at the end of each year, the exact number depending on the frequency of repairs in the previous
More informationNormal approximation to the Binomial
Chapter 5 Normal approximation to the Binomial 5.1 History In 1733, Abraham de Moivre presented an approximation to the Binomial distribution. He later (de Moivre, 1756, page 242 appended the derivation
More informationProbability distributions
Probability distributions (Notes are heavily adapted from Harnett, Ch. 3; Hayes, sections 2.142.19; see also Hayes, Appendix B.) I. Random variables (in general) A. So far we have focused on single events,
More informationChapter 4. Probability Distributions
Chapter 4 Probability Distributions Lesson 41/42 Random Variable Probability Distributions This chapter will deal the construction of probability distribution. By combining the methods of descriptive
More informationREPEATED TRIALS. The probability of winning those k chosen times and losing the other times is then p k q n k.
REPEATED TRIALS Suppose you toss a fair coin one time. Let E be the event that the coin lands heads. We know from basic counting that p(e) = 1 since n(e) = 1 and 2 n(s) = 2. Now suppose we play a game
More informationP (A) = lim P (A) = N(A)/N,
1.1 Probability, Relative Frequency and Classical Definition. Probability is the study of random or nondeterministic experiments. Suppose an experiment can be repeated any number of times, so that we
More informationProbability OPRE 6301
Probability OPRE 6301 Random Experiment... Recall that our eventual goal in this course is to go from the random sample to the population. The theory that allows for this transition is the theory of probability.
More informationP (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )
Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =
More informationDiscrete Mathematics and Probability Theory Fall 2009 Satish Rao,David Tse Note 11
CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao,David Tse Note Conditional Probability A pharmaceutical company is marketing a new test for a certain medical condition. According
More informationDiscrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 13. Random Variables: Distribution and Expectation
CS 70 Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 3 Random Variables: Distribution and Expectation Random Variables Question: The homeworks of 20 students are collected
More informationProbability & Probability Distributions
Probability & Probability Distributions Carolyn J. Anderson EdPsych 580 Fall 2005 Probability & Probability Distributions p. 1/61 Probability & Probability Distributions Elementary Probability Theory Definitions
More informationMT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo. 3 MT426 Notebook 3 3. 3.1 Definitions... 3. 3.2 Joint Discrete Distributions...
MT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo c Copyright 20042012 by Jenny A. Baglivo. All Rights Reserved. Contents 3 MT426 Notebook 3 3 3.1 Definitions............................................
More informationSection 6.1 Discrete Random variables Probability Distribution
Section 6.1 Discrete Random variables Probability Distribution Definitions a) Random variable is a variable whose values are determined by chance. b) Discrete Probability distribution consists of the values
More informationNormal distribution. ) 2 /2σ. 2π σ
Normal distribution The normal distribution is the most widely known and used of all distributions. Because the normal distribution approximates many natural phenomena so well, it has developed into a
More informationImportant Probability Distributions OPRE 6301
Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in reallife applications that they have been given their own names.
More informationm (t) = e nt m Y ( t) = e nt (pe t + q) n = (pe t e t + qe t ) n = (qe t + p) n
1. For a discrete random variable Y, prove that E[aY + b] = ae[y] + b and V(aY + b) = a 2 V(Y). Solution: E[aY + b] = E[aY] + E[b] = ae[y] + b where each step follows from a theorem on expected value from
More information4. Joint Distributions
Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 4. Joint Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space. Suppose
More informatione.g. arrival of a customer to a service station or breakdown of a component in some system.
Poisson process Events occur at random instants of time at an average rate of λ events per second. e.g. arrival of a customer to a service station or breakdown of a component in some system. Let N(t) be
More information0 x = 0.30 x = 1.10 x = 3.05 x = 4.15 x = 6 0.4 x = 12. f(x) =
. A mailorder computer business has si telephone lines. Let X denote the number of lines in use at a specified time. Suppose the pmf of X is as given in the accompanying table. 0 2 3 4 5 6 p(.0.5.20.25.20.06.04
More informationA crash course in probability and Naïve Bayes classification
Probability theory A crash course in probability and Naïve Bayes classification Chapter 9 Random variable: a variable whose possible values are numerical outcomes of a random phenomenon. s: A person s
More informationSection 65 Sample Spaces and Probability
492 6 SEQUENCES, SERIES, AND PROBABILITY 52. How many committees of 4 people are possible from a group of 9 people if (A) There are no restrictions? (B) Both Juan and Mary must be on the committee? (C)
More informationBinomial lattice model for stock prices
Copyright c 2007 by Karl Sigman Binomial lattice model for stock prices Here we model the price of a stock in discrete time by a Markov chain of the recursive form S n+ S n Y n+, n 0, where the {Y i }
More informationMath 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions
Math 70/408, Spring 2008 Prof. A.J. Hildebrand Actuarial Exam Practice Problem Set 2 Solutions About this problem set: These are problems from Course /P actuarial exams that I have collected over the years,
More informationSums of Independent Random Variables
Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables
More informationUniversity of California, Los Angeles Department of Statistics. Random variables
University of California, Los Angeles Department of Statistics Statistics Instructor: Nicolas Christou Random variables Discrete random variables. Continuous random variables. Discrete random variables.
More informationBinomial Distribution
Introductory Statistics Lectures Binomial Distribution Finding the probability of successes in n trials. Department of Mathematics Pima Community College Redistribution of this material is prohibited without
More informationFeb 7 Homework Solutions Math 151, Winter 2012. Chapter 4 Problems (pages 172179)
Feb 7 Homework Solutions Math 151, Winter 2012 Chapter Problems (pages 172179) Problem 3 Three dice are rolled. By assuming that each of the 6 3 216 possible outcomes is equally likely, find the probabilities
More informationPROBABILITIES AND PROBABILITY DISTRIBUTIONS
Published in "Random Walks in Biology", 1983, Princeton University Press PROBABILITIES AND PROBABILITY DISTRIBUTIONS Howard C. Berg Table of Contents PROBABILITIES PROBABILITY DISTRIBUTIONS THE BINOMIAL
More informationSufficient Statistics and Exponential Family. 1 Statistics and Sufficient Statistics. Math 541: Statistical Theory II. Lecturer: Songfeng Zheng
Math 541: Statistical Theory II Lecturer: Songfeng Zheng Sufficient Statistics and Exponential Family 1 Statistics and Sufficient Statistics Suppose we have a random sample X 1,, X n taken from a distribution
More informationCombinatorics: The Fine Art of Counting
Combinatorics: The Fine Art of Counting Week 7 Lecture Notes Discrete Probability Continued Note Binomial coefficients are written horizontally. The symbol ~ is used to mean approximately equal. The Bernoulli
More informationReview the following from Chapter 5
Bluman, Chapter 6 1 Review the following from Chapter 5 A surgical procedure has an 85% chance of success and a doctor performs the procedure on 10 patients, find the following: a) The probability that
More informationUNIT I: RANDOM VARIABLES PART A TWO MARKS
UNIT I: RANDOM VARIABLES PART A TWO MARKS 1. Given the probability density function of a continuous random variable X as follows f(x) = 6x (1x) 0
More informationChapter 3: The basic concepts of probability
Chapter 3: The basic concepts of probability Experiment: a measurement process that produces quantifiable results (e.g. throwing two dice, dealing cards, at poker, measuring heights of people, recording
More information6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:309:30 PM. SOLUTIONS
6.4/6.43 Spring 28 Quiz 2 Wednesday, April 6, 7:39:3 PM. SOLUTIONS Name: Recitation Instructor: TA: 6.4/6.43: Question Part Score Out of 3 all 36 2 a 4 b 5 c 5 d 8 e 5 f 6 3 a 4 b 6 c 6 d 6 e 6 Total
More informationA Tutorial on Probability Theory
Paola Sebastiani Department of Mathematics and Statistics University of Massachusetts at Amherst Corresponding Author: Paola Sebastiani. Department of Mathematics and Statistics, University of Massachusetts,
More informationUnit 21: Binomial Distributions
Unit 21: Binomial Distributions Summary of Video In Unit 20, we learned that in the world of random phenomena, probability models provide us with a list of all possible outcomes and probabilities for how
More informationExamination 110 Probability and Statistics Examination
Examination 0 Probability and Statistics Examination Sample Examination Questions The Probability and Statistics Examination consists of 5 multiplechoice test questions. The test is a threehour examination
More informationLecture 2 Binomial and Poisson Probability Distributions
Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution l Consider a situation where there are only two possible outcomes (a Bernoulli trial) H Example: u flipping a
More informationThe Basics of Financial Mathematics. Spring Richard F. Bass Department of Mathematics University of Connecticut
The Basics of Financial Mathematics Spring 23 Richard F. Bass Department of Mathematics University of Connecticut These notes are c 23 by Richard Bass. They may be used for personal use or class use, but
More information