Random Variables and Their Expected Values


Outline:
Discrete and Continuous Random Variables
The Probability Mass Function
The (Cumulative) Distribution Function

Definition. Let S be the sample space of some probabilistic experiment. A function X : S → R is called a random variable.

Example
1. A unit is selected at random from a population of units, so S is the collection of units in the population. Suppose a characteristic (weight, volume, or opinion on a certain matter) is recorded. A numerical description of the outcome is a random variable.
2. S = {s = (x_1, ..., x_n) : x_i ∈ R for all i}, with X(s) = Σ_i x_i, or X(s) = x̄ (the sample average), or X(s) = max{x_1, ..., x_n}.
3. S = {s : 0 ≤ s < ∞} (e.g. we may be recording the life time of an electrical component), with X(s) = I(s > 1500), or X(s) = s, or X(s) = log(s).

A random variable X induces a probability measure on the range of its values, which is denoted by X(S). X(S) can be thought of as the sample space of a compound experiment which consists of the original experiment and the subsequent transformation of the outcome into a numerical value. Because the value X(s) of the random variable X is determined from the outcome s, we may assign probabilities to the possible values of X. For example, if a die is rolled and we define X(s) = 1 for s = 1, 2, 3, 4, and X(s) = 0 for s = 5, 6, then P(X = 1) = 4/6 and P(X = 0) = 2/6. The probability measure P_X, induced on X(S) by the random variable X, is called the (probability) distribution of X.

The distribution of a random variable is considered known if the probabilities P_X((a, b]) = P(a < X ≤ b) are known for all a < b.

Definition. A random variable X is called discrete if X(S) is a finite or a countably infinite set. If X(S) is uncountably infinite, then X is called continuous.

For a discrete r.v. X, P_X is completely specified by the probabilities P_X({k}) = P(X = k), for each k ∈ X(S). (See part 4 of the Theorem on p. 15 of the 2nd lecture slides.) The function p(x) = P(X = x) is called the probability mass function (pmf) of X.

Example. Consider a batch of N = 10 products, 3 of which are defective. Draw 3 at random and without replacement and let the r.v. X denote the number of defective items. Find the pmf of X.

Solution: The sample space of X is S_X = {0, 1, 2, 3}, and

P(X = 0) = \binom{7}{3}\binom{3}{0} / \binom{10}{3}, P(X = 1) = \binom{7}{2}\binom{3}{1} / \binom{10}{3}, P(X = 2) = \binom{7}{1}\binom{3}{2} / \binom{10}{3}, P(X = 3) = \binom{7}{0}\binom{3}{3} / \binom{10}{3}.

Thus, the pmf of X is

x     | 0       1       2       3
p(x)  | 35/120  63/120  21/120  1/120
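The pmf just computed can be checked numerically. Below is a small Python sketch (not part of the original notes; the helper name hypergeom_pmf is our own choice), using only the standard library.

    from math import comb

    def hypergeom_pmf(x, n, M, N):
        # P(X = x) when a sample of n is drawn without replacement from a
        # population of N units, M of which are "special" (here: defective).
        return comb(M, x) * comb(N - M, n - x) / comb(N, n)

    # Batch of N = 10 products, M = 3 defective, sample of n = 3.
    for x in range(4):
        print(x, hypergeom_pmf(x, n=3, M=3, N=10))
    # prints approximately 0.2917, 0.525, 0.175, 0.0083; the values sum to 1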

Figure: Bar graph of the probability mass function p(x) plotted against the x values.

Example. Let S = {s = (x_1, ..., x_n) : x_i = 0 or 1 for all i} (so S is the sample space of n flips of a coin), and let p be the probability of 1 (i.e. the probability of heads). Then the probability measure on S is

P({(x_1, ..., x_n)}) = p^{Σ_i x_i} (1 − p)^{n − Σ_i x_i}.

Let X(s) = Σ_i x_i, so X(S) = {0, 1, ..., n}. Then the distribution P_X of X, which is a probability measure on X(S), is called the Binomial distribution. Suppose n = 3 and p = 1/2. Then

P_X(0) = P(X = 0) = 1/8, P_X(1) = P(X = 1) = 3/8 (why?), P_X(2) = P(X = 2) = 3/8, P_X(3) = P(X = 3) = 1/8.

Example. Three balls are selected at random and without replacement from an urn containing 20 balls numbered 1 through 20. Find the probability that at least one of the balls has a number 17 or larger.

Solution: Here S = {s = (i_1, i_2, i_3) : 1 ≤ i_1, i_2, i_3 ≤ 20}, X(s) = max{i_1, i_2, i_3}, X(S) = {3, 4, ..., 20}, and we want

P(X ≥ 17) = P(X = 17) + P(X = 18) + P(X = 19) + P(X = 20).

These are found from the formula

P(X = k) = \binom{k−1}{2} / \binom{20}{3} (why?)

The end result is P(X ≥ 17) = 580/1140 ≈ 0.509.
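A quick numerical check of this answer (a sketch added here, not from the notes), again using only the standard library:

    from math import comb

    # X = largest of 3 numbers drawn without replacement from 1, ..., 20.
    # P(X = k) = C(k-1, 2) / C(20, 3): the remaining two numbers must come
    # from the k - 1 values smaller than k.
    p = {k: comb(k - 1, 2) / comb(20, 3) for k in range(3, 21)}

    print(sum(p.values()))                    # 1.0 (sanity check)
    print(sum(p[k] for k in range(17, 21)))   # P(X >= 17) = 0.5087...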

The PMF of a Function of X

Let X be a discrete random variable with range (i.e. set of possible values) S_X and distribution P_X, and let Y = g(X) be a function of X with range S_Y. Then the pmf p_Y(y) of Y is given in terms of the pmf p_X(x) of X by

p_Y(y) = Σ_{x ∈ S_X : g(x) = y} p_X(x), for all y ∈ S_Y.

Example. Roll a die and let X denote the outcome. If X = 1 or 2, you win $1; if X = 3 you win $2; and if X ≥ 4 you win $4. Let Y denote your prize. Find the pmf of Y.

Solution: The pmf of Y is

y       | 1    2    4
p_Y(y)  | 2/6  1/6  3/6

Definition. The function F_X : R → [0, 1] (or simply F if no confusion is possible) defined by F_X(x) = P(X ≤ x) = P_X((−∞, x]) is called the (cumulative) distribution function of the rv X.

Proposition. F_X determines the probability distribution, P_X, of X.

Proof: We have that P_X is determined by its values P_X((a, b]) on the intervals (a, b]. However, P_X((a, b]) is determined from F_X by P_X((a, b]) = F_X(b) − F_X(a).

Example. Consider a batch of N = 10 products, 3 of which are defective. Draw 3 at random and without replacement and let the r.v. X denote the number of defectives. Find the cdf of X.

Solution:

x     | 0       1       2        3
p(x)  | 35/120  63/120  21/120   1/120
F(x)  | 35/120  98/120  119/120  1

Moreover, F(−1) = 0 and F(1.5) = F(1) = 98/120. Also, p(1) = F(1) − F(0) = 63/120.
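The same table can be reproduced with a few lines of Python (a sketch added here, not from the notes):

    from math import comb

    # pmf of the number of defectives (N = 10 units, 3 defective, sample of 3)
    pmf = {x: comb(3, x) * comb(7, 3 - x) / comb(10, 3) for x in range(4)}

    def F(x):
        # cdf: F(x) = P(X <= x), obtained by summing the pmf
        return sum(p for k, p in pmf.items() if k <= x)

    print(F(-1), F(0), F(1), F(1.5), F(3))   # 0.0  0.2917  0.8167  0.8167  1.0
    print(F(1) - F(0))                        # recovers p(1) = 0.525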

Example. Consider a random variable X with cumulative distribution function given by

F(x) = 0    for x < 1,
F(x) = 0.4  for 1 ≤ x < 2,
F(x) = 0.7  for 2 ≤ x < 3,
F(x) = 0.9  for 3 ≤ x < 4,
F(x) = 1    for x ≥ 4.

Figure: The CDF of a discrete distribution is a step (jump) function.

Example. Let X have the cdf shown above. Use the form of the cdf to deduce the distribution of X.

Solution: Since its cdf is a jump function, we conclude that X is discrete with sample space the jump points of its cdf, i.e. 1, 2, 3, and 4. Moreover, the probability with which X takes each value equals the size of the jump at that value (for example, P(X = 1) = 0.4). These deductions are justified as follows:
a) P(X < 1) = 0 means that X cannot take a value less than one.
b) F(1) = 0.4 then implies that P(X = 1) = 0.4.
c) The second of the equations defining F also implies that P(1 < X < 2) = 0, and so on.

Proposition (Properties of F_X). The distribution function F_X of a rv X satisfies:
1. lim_{x → −∞} F_X(x) = 0 and lim_{x → ∞} F_X(x) = 1.
2. F_X(x) is an increasing (i.e. non-decreasing) function.
3. F_X(x) is right continuous.

Proof: See Section 4.10.

The Expected Value of a Random Variable

A unit is selected at random from a population of N units, and let the random variable X be the (numerical) value of a characteristic of interest. Let v_1, v_2, ..., v_N be the values of the characteristic of interest for each of the N units. Then the expected value of X, denoted by µ_X or E(X), is defined by

E(X) = (1/N) Σ_{i=1}^{N} v_i.

Example
1. Let X denote the outcome of a roll of a die. Find E(X).
2. Let X denote the outcome of a roll of a die that has the six on four sides and the number 8 on the other two sides. Find E(X).

The expected value, E(X) or µ_X, of a discrete r.v. X having a possibly infinite sample space S_X and pmf p(x) = P(X = x), for x ∈ S_X, is defined as

µ_X = Σ_{x ∈ S_X} x p(x).

Example. Roll a die and let X denote the outcome. If X = 1 or 2, you win $1; if X = 3 you win $2; and if X ≥ 4 you win $4. Let Y denote your prize. Find E(Y).

Solution: The pmf of Y is

y       | 1    2    4
p_Y(y)  | 2/6  1/6  3/6

Thus, E(Y) = 1(2/6) + 2(1/6) + 4(3/6) = 16/6 ≈ 2.667.

Example. You are given a choice between accepting a fixed dollar amount or rolling a die and winning $X², where X is the outcome of the roll. What will you choose and why?

Solution: If the game will be played several times, your decision will be based on the value of E(X²). (Why?) To find this, use 1 + 4 + 9 + 16 + 25 + 36 = 91, so that E(X²) = 91/6 ≈ 15.17.

Proposition. Let X be a discrete r.v. taking values x_i, i ≥ 1, having pmf p_X. Then

E[g(X)] = Σ_i g(x_i) p_X(x_i).
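The proposition is easy to check on this example. The short sketch below (added here, not from the notes) uses exact fractions so the value 91/6 appears directly.

    from fractions import Fraction

    # E[g(X)] = sum of g(x) p(x) over x, for a fair die and g(x) = x**2
    p = Fraction(1, 6)
    E_X  = sum(x * p    for x in range(1, 7))
    E_X2 = sum(x**2 * p for x in range(1, 7))

    print(E_X, float(E_X))     # 7/2  3.5
    print(E_X2, float(E_X2))   # 91/6  15.1666...
    print(E_X2 == E_X**2)      # False: E(X^2) is not the same as (E X)^2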

Example. A product that is stocked monthly yields a net profit of b dollars for each unit sold and a net loss of l dollars for each unit left unsold at the end of the month. The monthly demand (i.e. number of units ordered) for this product at a particular store location is a rv having pmf p(k), k ≥ 0. If the store stocks s units, find the expected profit.

Solution: Let X be the monthly demand. The random variable of interest here is the profit

Y = g(X) = bX − (s − X)l,  if X ≤ s,
Y = g(X) = bs,             if X > s.

We want the expected profit, i.e. E(Y).
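Once b, l, s and the demand pmf are specified, E(Y) can be evaluated directly from the formula E[g(X)] = Σ_k g(k) p(k). The sketch below uses made-up numbers (they are not from the notes) purely to illustrate the computation.

    def expected_profit(b, l, s, pmf):
        # E(Y), where Y = b*X - (s - X)*l if X <= s, and Y = b*s if X > s;
        # pmf maps each possible demand k to P(X = k).
        total = 0.0
        for k, prob in pmf.items():
            profit = b * k - (s - k) * l if k <= s else b * s
            total += profit * prob
        return total

    # Hypothetical values: $5 profit per unit sold, $2 loss per unsold unit,
    # and an illustrative demand distribution on 0, ..., 4.
    demand_pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.25, 4: 0.15}
    for s in range(5):
        print(s, round(expected_profit(5, 2, s, demand_pmf), 2))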

Corollary. For constants a, b we have E(aX + b) = aE(X) + b.

Definition. The variance, σ_X² or Var(X), and standard deviation, σ_X or SD(X), of a rv X are

σ_X² = Var(X) = E(X − µ_X)²,  σ_X = √(σ_X²).

Proposition. Two common properties of the variance are
1. Var(X) = E(X²) − µ_X²
2. Var(aX + b) = a² Var(X)

The Bernoulli Random Variable

A r.v. X is called Bernoulli if it takes only two values. The two values are referred to as success (S) and failure (F), or are re-coded as 1 and 0. Thus, always, S_X = {0, 1}. Experiments resulting in a Bernoulli r.v. are called Bernoulli experiments.

Example
1. A product is inspected. Set X = 1 if defective, X = 0 if non-defective.
2. A product is put to a life test. Set X = 1 if it lasts more than 1000 hours, X = 0 otherwise.

If P(X = 1) = p, we write X ∼ Bernoulli(p) to indicate that X is Bernoulli with probability of success p.

If X ∼ Bernoulli(p), then
Its pmf is:
x     | 0      1
p(x)  | 1 − p  p
Its expected value is E(X) = p.
Its variance is σ_X² = p(1 − p). (Why?)

The Binomial Random Variable

An experiment consisting of n independent replications of a Bernoulli experiment is called a Binomial experiment. If X_1, X_2, ..., X_n are the Bernoulli r.v.'s for the n Bernoulli experiments,

Y = Σ_{i=1}^{n} X_i = the total number of 1s

is the Binomial r.v. Clearly S_Y = {0, 1, ..., n}. We write Y ∼ Bin(n, p) to indicate that Y is binomial with probability of success equal to p for each Bernoulli trial.

If Y ∼ Bin(n, p), then its pmf is

P(Y = k) = \binom{n}{k} p^k (1 − p)^{n−k},  k = 0, 1, ..., n.

Use x\binom{n}{x} = n\binom{n−1}{x−1} and x²\binom{n}{x} = nx\binom{n−1}{x−1} to get
1. Its expected value, E(Y) = np
2. Its variance, σ_Y² = np(1 − p)

Example. A company sells screws in packages of 10 and offers a money-back guarantee if two or more of the screws are defective. If a screw is defective with probability 0.01, independently of other screws, what proportion of the packages sold will the company replace?

Solution: Let X be the number of defective screws in a package, so X ∼ Bin(10, 0.01). The answer is

1 − P(X = 0) − P(X = 1) ≈ 0.004.
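A quick numerical check (a sketch added here, not from the notes):

    from math import comb

    n, p = 10, 0.01

    def pmf(k):
        # Bin(10, 0.01) probability of exactly k defective screws in a package
        return comb(n, k) * p**k * (1 - p)**(n - k)

    print(1 - pmf(0) - pmf(1))   # about 0.0043: roughly 0.4% of packages replaced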

Example. Physical traits, such as eye color, are determined from a pair of genes, one inherited from the mother and one from the father, each of which can be either dominant (d) or recessive (r). Persons with gene pairs (dd), (dr) and (rd) are alike in that physical trait. Assume that a child is equally likely to inherit either of the two genes from each parent. If both parents are hybrid with respect to a particular trait (i.e. both have gene pairs (dr) or (rd)), find the probability that three of their four children will be hybrid with respect to this trait.

Solution: The probability that an offspring of two hybrid parents is also hybrid is 1/2 (the offspring's pair must be (dr) or (rd) out of the four equally likely pairs). Thus, the desired probability is

\binom{4}{3} (1/2)³ (1/2) = 1/4.
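The same binomial computation in Python (a sketch added here, not from the notes):

    from math import comb

    # A child of two hybrid parents is hybrid with probability 1/2
    # (it must receive (dr) or (rd) out of the four equally likely pairs).
    p = 0.5
    print(comb(4, 3) * p**3 * (1 - p))   # 0.25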

Example. In order for the defendant to be convicted in a jury trial, at least eight of the twelve jurors must enter a guilty vote. Assume each juror makes the correct decision with probability 0.7, independently of the other jurors. If 40% of defendants in such jury trials are innocent, what is the probability that the jury renders the correct verdict for a randomly selected defendant?

Solution: Let B = {jury renders the correct verdict} and A = {defendant is innocent}. Then, according to the Law of Total Probability,

P(B) = P(B | A)P(A) + P(B | A^c)P(A^c) = P(B | A)(0.4) + P(B | A^c)(0.6).

Solution Continued: Next, let X denote the number of jurors who reach the correct decision in a particular trial. Thus X ∼ Bin(12, 0.7), and

P(B | A) = P(X ≥ 5) = 1 − Σ_{k=0}^{4} \binom{12}{k} 0.7^k 0.3^{12−k} ≈ 0.9905,

P(B | A^c) = P(X ≥ 8) = Σ_{k=8}^{12} \binom{12}{k} 0.7^k 0.3^{12−k} ≈ 0.7237.

Thus, P(B) = P(B | A)(0.4) + P(B | A^c)(0.6) ≈ 0.830.
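These binomial tail probabilities are easy to verify numerically (a sketch added here, not from the notes):

    from math import comb

    def binom_pmf(k, n, p):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    n, p = 12, 0.7
    correct_if_innocent = sum(binom_pmf(k, n, p) for k in range(5, 13))   # P(X >= 5)
    correct_if_guilty   = sum(binom_pmf(k, n, p) for k in range(8, 13))   # P(X >= 8)

    print(round(correct_if_innocent, 4))                                   # 0.9905
    print(round(correct_if_guilty, 4))                                     # 0.7237
    print(round(0.4 * correct_if_innocent + 0.6 * correct_if_guilty, 4))   # 0.8304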

Example. A communications system consisting of n components works if at least half of its components work. Suppose it is possible to add components to the system, and that currently the system has n = 2k − 1 components.
1. Show that by adding one component the system becomes more reliable, for all integers k ≥ 1.
2. Show that this is not necessarily the case if we add two components to the system.

Solution: 1. Let A_n = {the system works when it has n components}. Then
A_{2k−1} = {k or more of the 2k − 1 work},
A_{2k} = A_{2k−1} ∪ {exactly k − 1 of the original 2k − 1 work, and the 2k-th works}.

Solution Continued: It follows that A_{2k−1} ⊆ A_{2k}. Thus P(A_{2k−1}) ≤ P(A_{2k}).

2. Using the same notation,
A_{2k+1} = {k + 1 or more of the original 2k − 1 work}
  ∪ {exactly k of the original 2k − 1 work, and at least one of the 2k-th and (2k + 1)-th work}
  ∪ {exactly k − 1 of the original 2k − 1 work, and both the 2k-th and (2k + 1)-th work}.

It is seen that A_{2k−1} is not a subset of A_{2k+1} since, for example, A_{2k−1} includes the outcome in which exactly k of the original 2k − 1 work and both added components fail, but A_{2k+1} does not. It is also clear that A_{2k+1} is not a subset of A_{2k−1}. Thus, more information is needed to compare the reliability of the two systems.

Example (Example Continued). Suppose each component functions with probability p independently of the others. For what value of p is a (2k + 1)-component system more reliable than a (2k − 1)-component system?

Solution: Let X denote the number of the first 2k − 1 components that function. Then

P(A_{2k−1}) = P(X ≥ k) = P(X = k) + P(X ≥ k + 1),
P(A_{2k+1}) = P(X ≥ k + 1) + P(X = k)(1 − (1 − p)²) + P(X = k − 1)p²,

so that P(A_{2k+1}) − P(A_{2k−1}) = P(X = k − 1)p² − P(X = k)(1 − p)². Substituting the Bin(2k − 1, p) pmf and using \binom{2k−1}{k−1} = \binom{2k−1}{k}, this difference equals \binom{2k−1}{k} p^k (1 − p)^k (p − (1 − p)). Hence P(A_{2k+1}) − P(A_{2k−1}) > 0 iff p > 0.5.

The Hypergeometric Random Variable

The hypergeometric distribution arises when a simple random sample of size n is taken from a finite population of N units of which M are labeled 1 and the rest are labeled 0. The number X of units labeled 1 in the sample is a hypergeometric random variable with parameters n, M and N. This is denoted by X ∼ Hypergeo(n, M, N).

If X ∼ Hypergeo(n, M, N), its pmf is

P(X = x) = \binom{M}{x} \binom{N−M}{n−x} / \binom{N}{n}.

Note that P(X = x) = 0 if x > M, or if n − x > N − M.

Applications of the Hypergeometric Distribution

Read Example 8i, p. 161 (LTP with the hypergeometric). Quality control is the primary use of the hypergeometric distribution. The following is an example of a different use.

Example (The Capture/Recapture Method). This method is used to estimate the size N of a wildlife population. Suppose that 10 animals are captured, tagged and released. On a later occasion, 20 animals are captured. Let X be the number of recaptured (i.e. tagged) animals. If all \binom{N}{20} possible groups are equally likely, X is more likely to take small values if N is large. The precise form of the hypergeometric pmf can be used to estimate N from the value that X takes.
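One way to make the last remark concrete is to pick the value of N that maximizes the hypergeometric likelihood of the observed X. The sketch below (added here, not from the notes) assumes a made-up observation of x = 6 tagged animals among the 20 recaptured; the function name and the search range are our own choices.

    from math import comb

    def likelihood(N, x, tagged=10, sample=20):
        # P(X = x) as a function of the unknown population size N
        if N < tagged + (sample - x):
            return 0.0
        return comb(tagged, x) * comb(N - tagged, sample - x) / comb(N, sample)

    x = 6   # hypothetical data: 6 of the 20 recaptured animals carry a tag
    best_N = max(range(24, 1001), key=lambda N: likelihood(N, x))
    print(best_N)   # 33, close to the intuitive estimate 10 * 20 / 6 = 33.3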

If X ∼ Hypergeo(n, M, N), then
Its expected value is: µ_X = n(M/N)
Its variance is: σ_X² = n (M/N)(1 − M/N)(N − n)/(N − 1)
The factor (N − n)/(N − 1) is called the finite population correction factor.

Binomial Approximation to Hypergeometric Probabilities: If n ≤ 0.05 N, then P(X = x) ≈ P(Y = x), where Y ∼ Bin(n, p = M/N).

Example. n = 10 electronic toys are selected at random from a batch delivery. We want the probability that 2 of the 10 will be defective.
1. If the batch has N = 20 toys with 5 of them defective,

P(X = 2) = \binom{5}{2}\binom{15}{8} / \binom{20}{10} ≈ 0.348.

Application of the binomial (n = 10, p = 0.25) approximation gives P(Y = 2) ≈ 0.282.
2. If N = 100 and M = 25 (so p remains 0.25), we have

P(X = 2) = \binom{25}{2}\binom{75}{8} / \binom{100}{10} ≈ 0.292.

It is seen that the binomial probability of 0.282 provides a better approximation when N is large.
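The three probabilities compared above can be reproduced as follows (a sketch added here, not from the notes):

    from math import comb

    def hyper(x, n, M, N):
        return comb(M, x) * comb(N - M, n - x) / comb(N, n)

    def binom(x, n, p):
        return comb(n, x) * p**x * (1 - p)**(n - x)

    print(round(hyper(2, 10, 5, 20), 3))     # 0.348  (small N: approximation is rough)
    print(round(hyper(2, 10, 25, 100), 3))   # 0.292  (larger N: much closer)
    print(round(binom(2, 10, 0.25), 3))      # 0.282  (the binomial approximation)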

Binomial Approximation to Hypergeometric Probabilities

As N → ∞ and M → ∞, with M/N → p and n fixed,

\binom{M}{x} \binom{N−M}{n−x} / \binom{N}{n} → \binom{n}{x} p^x (1 − p)^{n−x},  x = 0, 1, ..., n.

One way to show this is via Stirling's formula for approximating factorials:

n! ≈ \sqrt{2πn} (n/e)^n, or more precisely n! = \sqrt{2πn} (n/e)^n e^{λ_n}, where 1/(12n + 1) < λ_n < 1/(12n).

Use this on the left hand side and note that the terms resulting from \sqrt{2πn} tend to 1, and the powers of e cancel. Thus,

\binom{M}{x} \binom{N−M}{n−x} / \binom{N}{n} ≈ \binom{n}{x} M^M (N−M)^{N−M} (N−n)^{N−n} / [N^N (M−x)^{M−x} (N−M−n+x)^{N−M−n+x}],

where

M^M / (M−x)^{M−x} = (1 + x/(M−x))^{M−x} M^x,

(N−n)^{N−n} / N^N = (1 − n/N)^N (N−n)^{−n},

(N−M)^{N−M} / (N−M−n+x)^{N−M−n+x} = (1 + (n−x)/(N−M−n+x))^{N−M−n+x} (N−M)^{n−x}.

As N → ∞ with M/N → p, the three parenthesized factors converge to e^x, e^{−n} and e^{n−x}, whose product is 1, while M^x (N−M)^{n−x} / (N−n)^n → p^x (1 − p)^{n−x}, which gives the stated limit.

The Negative Binomial Random Variable

In the negative binomial experiment, a Bernoulli experiment is repeated independently until the r-th 1 is observed. For example, products are inspected, as they come off the assembly line, until the r-th defective is found. The number, Y, of Bernoulli experiments until the r-th 1 is observed is the negative binomial r.v. If p is the probability of 1 in a Bernoulli trial, we write Y ∼ NBin(r, p). If r = 1, Y is called the geometric r.v.

If Y ∼ NBin(r, p), then
Its pmf is: P(Y = y) = \binom{y−1}{r−1} p^r (1 − p)^{y−r},  y = r, r + 1, ...
Its expected value is: E(Y) = r/p
Its variance is: σ_Y² = r(1 − p)/p²

If r = 1 the Negative Binomial is called Geometric: P(X = x | p) = p(1 − p)^{x−1}, x ≥ 1.

The memoryless property: For integers s > t,

P(X > s | X > t) = P(X > s − t).

Example. Independent Bernoulli trials are performed with probability of success p. Find the probability that r successes will occur before m failures.

Solution: r successes will occur before m failures iff the r-th success occurs no later than the (r + m − 1)-th trial. Hence the desired probability is found from

Σ_{k=r}^{r+m−1} \binom{k−1}{r−1} p^r (1 − p)^{k−r}.
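The sum is straightforward to evaluate numerically; the sketch below (added here, not from the notes) checks it against a case that can also be worked out by hand.

    from math import comb

    def r_before_m(r, m, p):
        # P(r successes occur before m failures): the r-th success must arrive
        # no later than trial r + m - 1, so sum the negative binomial pmf.
        return sum(comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)
                   for k in range(r, r + m))

    print(r_before_m(3, 2, 0.5))   # 0.3125 = 5/16 = P(3 heads appear before 2 tails)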

Example. A candle is lit every evening at dinner time with a match taken from one of two match boxes. Assume each box is equally likely to be chosen and that initially both contained N matches. What is the probability that there are exactly k matches left, k = 0, 1, ..., N, when one of the match boxes is first discovered empty?

Solution: Let E be the event that box #1 is discovered empty and there are k matches in box #2. E will occur iff the (N + 1)-th choice of box #1 is made at the (2N − k + 1)-th trial. Thus,

P(E) = \binom{2N − k}{N} (0.5)^{2N − k + 1},

and the desired probability is 2P(E).
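As a numerical sanity check (a sketch added here, not from the notes), the probabilities 2P(E), k = 0, ..., N, should add up to 1:

    from math import comb

    def matchbox_pmf(k, N):
        # P(exactly k matches remain in the other box) = 2 * P(E)
        return 2 * comb(2 * N - k, N) * 0.5 ** (2 * N - k + 1)

    N = 50
    dist = [matchbox_pmf(k, N) for k in range(N + 1)]
    print(sum(dist))            # 1.0 (up to rounding): the probabilities are proper
    print(round(dist[0], 4))    # about 0.0796: the other box is also empty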

Example. Three electrical engineers toss coins to see who pays for coffee. If all three match, they toss another round. Otherwise the odd person pays for coffee.
1. Find the probability of a round of tossing resulting in a match. Answer: 2(1/2)³ = 0.25.
2. Let Y be the number of times they toss coins until the odd person is determined. What is the distribution of Y? Answer: Geometric with p = 0.75.
3. Find P(Y ≥ 3). Answer: P(Y ≥ 3) = 1 − P(Y = 1) − P(Y = 2) = 1 − 0.75 − (0.25)(0.75) = 0.0625.

The Poisson Random Variable

X ∼ Poisson(λ) if

P(X = x | λ) = (e^{−λ} λ^x / x!) I(x ∈ {0, 1, ...}).

Use e^t = Σ_{i=0}^{∞} t^i / i! to get E X = λ, σ_X² = λ.

A recursion relation: P(X = x) = (λ/x) P(X = x − 1), x ≥ 1. This recursion relation, and that for the Binomial, can also be used to establish that if Y ∼ Bin(n, p) then, as n → ∞ with np → λ, P(Y = k) → P(X = k). Combined with the binomial approximation to the hypergeometric, this also yields a Poisson approximation to hypergeometric probabilities.

Proposition. If Y ∼ Bin(n, p), with n ≥ 100, p ≤ 0.01, and np ≤ 20, then

P(Y ≤ k) ≈ P(X ≤ k),  k = 0, 1, 2, ..., n,

where X ∼ Poisson(λ = np).

Example. Due to a serious defect, n = 10,000 cars are recalled. The probability that a car is defective is p = 0.0005. If Y is the number of defective cars, find: (a) P(Y ≥ 10), and (b) P(Y = 0).

Solution: Let X ∼ Poisson(λ = np = 5). Then,
(a) P(Y ≥ 10) ≈ P(X ≥ 10) = 1 − P(X ≤ 9) ≈ 0.032.
(b) P(Y = 0) ≈ P(X = 0) = e^{−5} ≈ 0.0067.
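The quality of the approximation is easy to see by computing both sides (a sketch added here, not from the notes):

    from math import comb, exp, factorial

    n, p = 10_000, 0.0005
    lam = n * p   # 5.0

    def binom_pmf(k):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def pois_pmf(k):
        return exp(-lam) * lam**k / factorial(k)

    # (a) P(Y >= 10) and (b) P(Y = 0): exact binomial vs Poisson approximation
    print(1 - sum(binom_pmf(k) for k in range(10)),
          1 - sum(pois_pmf(k) for k in range(10)))   # both about 0.032
    print(binom_pmf(0), pois_pmf(0))                  # both about 0.0067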

The Poisson Process

When we record the number of occurrences as they accumulate with time, we denote by X(t) the number of occurrences in the time interval [0, t].

Definition. The number of occurrences as a function of time, X(t), t ≥ 0, is called a Poisson process if the following assumptions are satisfied.
1. The probability of exactly one occurrence in a short time period of length Δt is approximately α Δt.
2. The probability of more than one occurrence in a short time period of length Δt is approximately 0.
3. The number of occurrences in a given time period is independent of the number of occurrences prior to that period.

The parameter α in the first assumption specifies the rate of the occurrences, i.e. the average number of occurrences per time unit. Because in an interval of length t_0 time units we would expect, on average, λ = α t_0 occurrences, we have the following

Proposition. Let X(t), t ≥ 0, be a Poisson process.
1. For each fixed t_0, X(t_0) ∼ Poisson(λ = α t_0). Thus,
P(X(t_0) = k) = e^{−α t_0} (α t_0)^k / k!,  k = 0, 1, 2, ...
2. If t_1 < t_2 are two positive numbers, then X(t_2) − X(t_1) ∼ Poisson(α (t_2 − t_1)).

Example. Continuous electrolytic inspection of a tin plate yields on average 0.2 imperfections per minute. Find:
1. The probability of one imperfection in three minutes.
2. The probability of at most one imperfection in 0.25 hours.

Solution: 1) Here α = 0.2, t = 3, λ = αt = 0.6. Thus,
P(X(3) = 1) = F(1; λ = 0.6) − F(0; λ = 0.6) = 0.878 − 0.549 = 0.329.
2) Here α = 0.2, t = 15, λ = αt = 3.0. Thus,
P(X(15) ≤ 1) = F(1; λ = 3.0) = 0.199.
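Both answers can be checked with a short Poisson-cdf helper (a sketch added here, not from the notes):

    from math import exp, factorial

    def poisson_cdf(k, lam):
        # F(k; lam) = P(X <= k) for X ~ Poisson(lam)
        return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

    # alpha = 0.2 imperfections per minute
    print(poisson_cdf(1, 0.6) - poisson_cdf(0, 0.6))   # 0.329: one imperfection in 3 min
    print(poisson_cdf(1, 3.0))                          # 0.199: at most one in 15 min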
