ST 371 (IV): Discrete Random Variables


1 Random Variables

A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical value to each possible outcome of the experiment. We denote random variables by uppercase letters, often X, Y or Z.

Examples of random variables (rv):

Toss a coin. The sample space is S = {H, T}. Define an rv X such that X({H}) = 1 and X({T}) = 0. X is called a Bernoulli random variable.

Toss a coin until a head is observed. The sample space is S = {H, TH, TTH, ...}. Define X = the number of tosses needed until a head is observed. Then X({TH}) = 2 and X({TTTTH}) = 5.

Roll a pair of dice. Define
X = the sum of the numbers on the dice,
Y = the absolute difference between the two numbers on the dice,
Z = the maximum of the two numbers on the dice.
Consider the outcome ω = (2, 3). Then X(ω) = 5, Y(ω) = 1, Z(ω) = 3.

Select a location in the US at random and define Y = the height above sea level (in feet) at the selected location. The largest possible value of Y is 14,494 and the smallest is -282. The sample space is S = {y : -282 ≤ y ≤ 14,494}.
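As a concrete illustration of a random variable as a function on the sample space, here is a minimal Python sketch (the code and variable names are illustrative, not part of the original notes) that enumerates the 36 outcomes for a pair of dice and evaluates the X, Y and Z defined above on one outcome.

```python
from itertools import product

# Sample space for a pair of dice: all ordered pairs (i, j).
sample_space = list(product(range(1, 7), repeat=2))

# Random variables are simply functions defined on the sample space.
X = lambda w: w[0] + w[1]        # sum of the two numbers
Y = lambda w: abs(w[0] - w[1])   # absolute difference between the two numbers
Z = lambda w: max(w)             # maximum of the two numbers

w = (2, 3)
print(X(w), Y(w), Z(w))          # 5 1 3, as in the example above
```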

Discrete and continuous random variables. A random variable that can take on a finite or at most countably infinite number of values is said to be discrete (countably infinite means that the members of a set can be listed in an infinite sequence in which there is a first element, a second element, and so on). Examples include:

the gender of a randomly selected student in class
the total number of coin tosses required for observing two heads
the number of students who are absent on the first day of class
the number of people arriving for treatment at an emergency room

A random variable that can take on values in an interval of real numbers is said to be continuous. Examples include:

the depth at randomly chosen locations of a lake
the amount of gas needed to drive to work on a given day
the survival time of a cancer patient

We will focus on discrete random variables in Chapter 3 and consider continuous random variables in Chapter 4.

2 Probability Mass Function

Associated with each discrete random variable X is a probability mass function (pmf) that gives the probability that X equals x:

p(x) = P({X = x}) = P({all s ∈ S : X(s) = x}).

Example 1 Consider whether the next customer buying a laptop at a university bookstore buys a Mac or a PC model. Let

X = 1 if the customer purchases a Mac,
X = 0 if the customer purchases a PC.

If 20% of all customers during that week select a Mac, what is the pmf of the rv X?

Example 2 Suppose two fair dice are tossed. Let X be the random variable that is the sum of the two upturned faces. X is a discrete random variable since it has finitely many possible values (the 11 integers 2, 3, ..., 12). The probability mass function of X is

x    :   2     3     4     5     6     7     8     9    10    11    12
p(x) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

It is often instructive to present the probability mass function in a graphical format, plotting p(x_i) on the y-axis against x_i on the x-axis.
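The pmf in Example 2 can be checked by direct enumeration. A short Python sketch (illustrative only; it assumes the two dice are fair, so all 36 outcomes are equally likely) counts how many outcomes give each sum.

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# All 36 equally likely outcomes for two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# pmf of X = sum of the two faces: count outcomes mapping to each value of x.
counts = Counter(i + j for i, j in outcomes)
pmf = {x: Fraction(n, 36) for x, n in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)            # 2 1/36, 3 1/18 (= 2/36), ..., 7 1/6 (= 6/36), ..., 12 1/36

print(sum(pmf.values()))   # 1, as every pmf must satisfy
```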

[Figure: plot of the probability mass function p(x) of X against x]

Remarks: So far, we have been defining probability functions in terms of the elementary outcomes making up an experiment's sample space. Thus, if two fair dice were tossed, a probability was assigned to each of the 36 possible pairs of upturned faces. We have seen that in certain situations some attribute of an outcome may hold more interest for the experimenter than the outcome itself. A craps player, for example, may be concerned only that he throws a 7, not whether the 7 was the result of a 5 and a 2, a 4 and a 3, or a 6 and a 1. That being the case, it makes sense to replace the 36-member sample space S = {(i, j) : i = 1, ..., 6; j = 1, ..., 6} with the more relevant (and simpler) 11-member sample space of all possible two-dice sums, S = {x : x = i + j = 2, 3, ..., 12}. This redefinition of the sample space not only changes the number of outcomes in the space (from 36 to 11) but also changes the probability structure. In the original sample space, all 36 outcomes are equally likely. In the revised sample space, the 11 outcomes are not equally likely.

Example 3 Three balls are to be randomly selected without replacement from an urn containing balls numbered 1 through 20. Let X denote the largest number selected. X is a random variable taking on the values 3, 4, ..., 20. Since we select the balls randomly, each of the C_{3,20} combinations of the balls is equally likely to be chosen. The probability mass function of X is

P({X = i}) = C_{2,i-1} / C_{3,20},   i = 3, ..., 20.

This equation follows because the number of selections that result in the event {X = i} is just the number of selections in which the ball numbered i and two of the balls numbered 1 through i - 1 are chosen.

[Figure: plot of the probability mass function of X]

Suppose the random variable X can take on the values {x_1, x_2, ...}. Since the probability mass function is a probability function on the redefined sample space that considers values of X, we have that

Σ_{i ≥ 1} P(X = x_i) = 1.
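The formula in Example 3 can be verified by brute force. The following Python sketch (variable names such as pmf_formula are illustrative) compares the stated pmf with a direct count over all C_{3,20} selections and checks that the probabilities sum to 1.

```python
from itertools import combinations
from math import comb
from fractions import Fraction

# Example 3: draw 3 balls from balls numbered 1..20; X = largest number drawn.
n, k = 20, 3
total = comb(n, k)  # C_{3,20} equally likely selections

# Formula from the notes: P(X = i) = C_{2,i-1} / C_{3,20}.
pmf_formula = {i: Fraction(comb(i - 1, k - 1), total) for i in range(k, n + 1)}

# Brute-force check: count the selections whose maximum equals i.
counts = {}
for sel in combinations(range(1, n + 1), k):
    counts[max(sel)] = counts.get(max(sel), 0) + 1
pmf_enum = {i: Fraction(c, total) for i, c in counts.items()}

assert pmf_formula == pmf_enum      # the formula matches the enumeration
assert sum(pmf_formula.values()) == 1
```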

This follows from

1 = P(S) = P(∪_{i ≥ 1} {X = x_i}) = Σ_{i ≥ 1} P(X = x_i).

Example 4 Independent trials, consisting of the flipping of a coin having probability p of coming up heads, are continually performed until a head occurs. Let X be the random variable that denotes the number of times the coin is flipped. The probability mass function of X is

P{X = 1} = P{H} = p
P{X = 2} = P{(T, H)} = (1 - p)p
P{X = 3} = P{(T, T, H)} = (1 - p)^2 p
...
P{X = n - 1} = P{(T, ..., T, H)} = (1 - p)^{n-2} p   (n - 2 tails followed by a head)
P{X = n} = P{(T, ..., T, H)} = (1 - p)^{n-1} p       (n - 1 tails followed by a head)

3 Cumulative Distribution Function

The cumulative distribution function (CDF) of a random variable X is the function

F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y).
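A small Python sketch (illustrative; the value p = 0.3 is an arbitrary choice, and the infinite support is truncated for computation) implements the pmf of Example 4 and builds the CDF F(x) = P(X ≤ x) by summing the pmf.

```python
# Geometric pmf from Example 4, and the CDF F(x) = P(X <= x) built from it.
# A minimal sketch; p = 0.3 is an arbitrary illustrative value.
p = 0.3

def pmf(n):
    """P(X = n) = (1 - p)**(n - 1) * p: n - 1 tails followed by a head."""
    return (1 - p) ** (n - 1) * p

def cdf(x, support=range(1, 500)):
    """F(x) = P(X <= x): sum the pmf over all support values y <= x."""
    return sum(pmf(y) for y in support if y <= x)

print(cdf(5))                              # ≈ 0.83193 = 1 - 0.7**5
print(sum(pmf(n) for n in range(1, 500)))  # ≈ 1, as required of a pmf
```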

Example 5 The pmf of a random variable X is given by a table of values x and probabilities p(x) whose entries involve an unknown constant c. What is c? What is the cdf of X? Calculate P(2 ≤ X ≤ 4).

All probability questions about X can be answered in terms of the cdf F. Specifically, for a discrete random variable X,

P(a < X ≤ b) = F(b) - F(a)
P(a ≤ X ≤ b) = F(b) - F(a - 1)   (when X takes integer values)

for all a < b. This can be seen by writing the event {X ≤ b} as the union of the mutually exclusive events {X ≤ a} and {a < X ≤ b}. That is,

{X ≤ b} = {X ≤ a} ∪ {a < X ≤ b}.

Therefore, we have P{X ≤ b} = P{X ≤ a} + P{a < X ≤ b}, and the result follows.

Example 6 Consider selecting at random a student from among the 15,000 registered for the current semester at NCSU. Let X = the number of courses for which the selected student is registered, and suppose that X has the pmf given by a table of values x and p(x). What is the probability that a student chooses three or more courses?
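The identity P(a < X ≤ b) = F(b) - F(a) is easy to check numerically. The sketch below (illustrative only; it reuses the two-dice pmf from Example 2) compares the CDF difference with the direct sum of pmf values.

```python
from fractions import Fraction

# Interval probabilities from the CDF, using the pmf of the sum of two fair dice.
pmf = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

def F(x):
    """CDF: F(x) = P(X <= x)."""
    return sum(p for v, p in pmf.items() if v <= x)

a, b = 4, 7
print(F(b) - F(a))                               # P(4 < X <= 7) = 15/36 = 5/12
print(sum(pmf[x] for x in range(a + 1, b + 1)))  # the same value, summed directly
print(F(b) - F(a - 1))                           # P(4 <= X <= 7) = 18/36 = 1/2
```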

4 Expected Value

Probability mass functions provide a global overview of a random variable's behavior. Detail that explicit, though, is not always necessary, or even helpful. Often we want to condense the information contained in the pmf by summarizing certain of its features with single numbers. The first feature of a pmf that we will examine is central tendency, a term referring to the average value of a random variable. The most frequently used measure of central tendency is the expected value.

For a discrete random variable, the expected value of X is a weighted average of the possible values X can take on, each value being weighted by the probability that X assumes it:

E(X) = Σ_{x: p(x)>0} x p(x).

A simple fact: E(X + Y) = E(X) + E(Y).

Example 7 Consider the experiment of rolling a die. Let X be the number on the face. Compute E(X). Now consider rolling a pair of dice and let Y be the sum of the numbers. Compute E(Y).
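For Example 7, the expected values can be computed directly from the definition. A minimal Python sketch (illustrative; exact fractions are used to avoid rounding):

```python
from fractions import Fraction
from itertools import product

# Expected value as a probability-weighted average.
# One fair die: E(X) = sum of x * p(x) over the support, each p(x) = 1/6.
EX = sum(Fraction(x, 6) for x in range(1, 7))
print(EX)   # 7/2

# Sum of two fair dice: weight each of the 36 outcomes by 1/36.
EY = sum(Fraction(i + j, 36) for i, j in product(range(1, 7), repeat=2))
print(EY)   # 7, consistent with E(X + X') = E(X) + E(X') = 7/2 + 7/2
```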

Example 8 Consider Example 6. What is the average number of courses per student at NCSU?

5 Expectation of a Function of a Random Variable

Suppose we are given a discrete random variable X along with its pmf and that we want to compute the expected value of some function of X, say g(X). One approach is to directly determine the pmf of g(X).

Example 9 Let X denote a random variable that takes on the values -1, 0, 1 with respective probabilities

P(X = -1) = .2,   P(X = 0) = .5,   P(X = 1) = .3.

Compute E(X^2).

Although the procedure we used in the previous example will always enable us to compute the expected value of g(X) from knowledge of the pmf of X, there is another way of thinking about E[g(X)]. Noting that g(X) will equal g(x) whenever X is equal to x, it seems reasonable that E[g(X)] should just be a weighted average of the values g(x), with g(x) being weighted by the probability that X is equal to x.

Proposition 1 If X is a discrete random variable that takes on one of the values x_i, i ≥ 1, with respective probabilities p(x_i), then for any real-valued function g,

E[g(X)] = Σ_i g(x_i) p(x_i).

Applying the proposition to Example 9,

E(X^2) = (-1)^2 (.2) + 0^2 (.5) + 1^2 (.3) = .5.

Proof of Proposition 1. Let y_1, y_2, ... denote the distinct values of g(x_i). Grouping the terms of the sum according to these values,

Σ_i g(x_i) p(x_i) = Σ_j Σ_{i: g(x_i) = y_j} g(x_i) p(x_i)
                  = Σ_j y_j Σ_{i: g(x_i) = y_j} p(x_i)
                  = Σ_j y_j P{g(X) = y_j}
                  = E[g(X)].

Corollary 1 (Rule of expected value) If a and b are constants, then E(aX + b) = aE(X) + b.

Proof of Corollary 1:

E(aX + b) = Σ_x (ax + b) p(x) = a Σ_x x p(x) + b Σ_x p(x) = aE(X) + b.
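The two routes to E[g(X)] described above, via the pmf of g(X) and via Proposition 1, can be compared on Example 9. A short Python sketch (illustrative only):

```python
from collections import defaultdict

# Two ways to compute E[g(X)] for Example 9, with g(x) = x**2.
pmf_X = {-1: 0.2, 0: 0.5, 1: 0.3}
g = lambda x: x ** 2

# Approach 1: first build the pmf of Y = g(X), then take the weighted average.
pmf_Y = defaultdict(float)
for x, p in pmf_X.items():
    pmf_Y[g(x)] += p          # P(Y = 1) = 0.2 + 0.3 = 0.5, P(Y = 0) = 0.5
E_via_pmf_of_Y = sum(y * p for y, p in pmf_Y.items())

# Approach 2 (Proposition 1): weight g(x) directly by p(x).
E_via_proposition = sum(g(x) * p for x, p in pmf_X.items())

print(E_via_pmf_of_Y, E_via_proposition)   # 0.5 0.5
```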

Special cases of Corollary 1:

E(aX) = aE(X).
E(X + b) = E(X) + b.

Example 10 A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a certain period at $200 apiece. Let X denote the number of computers sold, and suppose that P(X = 0) = 0.1, P(X = 1) = 0.2, P(X = 2) = 0.3 and P(X = 3) = 0.4. Let h(X) denote the profit associated with selling X units. What is the expected profit?
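Using the profit function h(X) = 800X - 900 that Example 11 states for this same setup (revenue 1000X, buyback 200(3 - X), purchase cost 3 × 500), a short Python sketch (illustrative, not the notes' own solution) computes the expected profit by Proposition 1 and, equivalently, by Corollary 1.

```python
# Expected profit for Example 10, using h(X) = 800X - 900 from Example 11.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
h = lambda x: 800 * x - 900

expected_profit = sum(h(x) * p for x, p in pmf.items())   # Proposition 1
print(expected_profit)                                     # ≈ 700.0

# Equivalent via Corollary 1: E(h(X)) = 800 * E(X) - 900, with E(X) = 2.
EX = sum(x * p for x, p in pmf.items())
print(800 * EX - 900)                                      # ≈ 700.0
```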

6 Variance

Another useful summary of a random variable's pmf besides its central tendency is its spread. This is a very important concept in real life. For example, in quality control of hard-disk lifetimes, we not only want the lifetime of a hard disk to be long, but also want the lifetimes not to be too variable. Another example is in finance, where investors not only want investments with good returns (i.e., a high expected value) but also want the investment not to be too risky (i.e., a low spread).

A commonly used measure of spread is the variance of a random variable, which is the expected squared deviation of the random variable from its expected value. Specifically, let X have pmf p(x) and expected value µ. Then the variance of X, denoted by V(X) or just σ²_X, is

V(X) = E[(X - µ)²] = Σ_{x ∈ D} (x - µ)² p(x),

where D is the set of possible values of X. The second equality holds by applying Proposition 1.

Explanations and intuitions for variance:

(X - µ)² is the squared deviation of X from its mean.
The variance is the weighted average of the squared deviations, where the weights are probabilities from the distribution.
If most values of x are close to µ, then σ² will be relatively small. If most values of x are far away from µ, then σ² will be relatively large.

Definition: the standard deviation (SD) of X is σ_X = sqrt(V(X)) = sqrt(σ²_X).

Consider the following situations:

The following three random variables have expected value 0 but very different spreads:

X = 0 with probability 1.
Y = -1 with probability 0.5, 1 with probability 0.5.
Z = -100 with probability 0.5, 100 with probability 0.5.

Compare V(X), V(Y) and V(Z).

Suppose that the rate of return on stock A takes on the values 30%, 10% and -10% with respective probabilities 0.25, 0.50 and 0.25, and on stock B the values 50%, 10% and -30% with the same probabilities 0.25, 0.50 and 0.25. Each stock then has an expected rate of return of 10%. Obviously stock A has less spread in its rate of return. Compare V(A) and V(B).
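The comparisons above can be carried out numerically. A minimal Python sketch (illustrative; mean and var are hypothetical helper names) computes the variances from the stated pmfs.

```python
# Variances for the spread comparisons above, straight from the definition.
def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

X = {0: 1.0}
Y = {-1: 0.5, 1: 0.5}
Z = {-100: 0.5, 100: 0.5}
print(var(X), var(Y), var(Z))   # 0.0 1.0 10000.0

# Rates of return (in %) for stocks A and B: same mean, very different spread.
A = {30: 0.25, 10: 0.50, -10: 0.25}
B = {50: 0.25, 10: 0.50, -30: 0.25}
print(mean(A), mean(B))         # 10.0 10.0
print(var(A), var(B))           # 200.0 800.0
```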

An alternative formula for variance.

V(X) = E(X^2) - [E(X)]^2.

Proof. Let E(X) = µ. Then

V(X) = E[(X - µ)^2] = Σ_x (x - µ)^2 p(x)
     = Σ_x (x^2 - 2µx + µ^2) p(x)
     = Σ_x x^2 p(x) - 2µ Σ_x x p(x) + µ^2 Σ_x p(x)
     = E(X^2) - 2µ^2 + µ^2
     = E(X^2) - µ^2
     = E(X^2) - [E(X)]^2.

The variance of a linear function. Let a, b be two constants. Then

V(aX + b) = a^2 V(X).

Proof. Note that from Corollary 1 we have E(aX + b) = aE(X) + b. Let E(X) = µ. Then

V(aX + b) = E[{(aX + b) - E(aX + b)}^2]
          = E[(aX + b - aµ - b)^2]
          = E[a^2 (X - µ)^2]
          = a^2 E[(X - µ)^2]
          = a^2 V(X).

Example 11 Let X denote the number of computers sold, and suppose that the pmf of X is

P(X = 0) = 0.1, P(X = 1) = 0.2, P(X = 2) = 0.3, P(X = 3) = 0.4.

The profit is a function of the number of computers sold: h(X) = 800X - 900. What are the variance and SD of the profit h(X)?
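For Example 11, the variance of the profit can be obtained either directly from the definition or from the linear-function rule proved above. A short Python sketch (illustrative, not the notes' own solution):

```python
from math import sqrt

# Variance and SD of the profit in Example 11, computed two ways.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
h = lambda x: 800 * x - 900

# Directly: treat h(X) as a new random variable and apply the definition.
mu_h = sum(h(x) * p for x, p in pmf.items())
var_direct = sum((h(x) - mu_h) ** 2 * p for x, p in pmf.items())

# Via the linear-function rule: V(800X - 900) = 800**2 * V(X).
mu_X = sum(x * p for x, p in pmf.items())
var_X = sum((x - mu_X) ** 2 * p for x, p in pmf.items())
var_rule = 800 ** 2 * var_X

print(var_direct, var_rule)   # both ≈ 640000.0
print(sqrt(var_direct))       # SD of the profit, ≈ 800.0
```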
