ST 371 (IV): Discrete Random Variables
1 Random Variables

A random variable (rv) is a function that is defined on the sample space of the experiment and that assigns a numerical value to each possible outcome of the experiment. We denote random variables by uppercase letters, often X, Y or Z.

Examples of random variables:

- Toss a coin. The sample space is S = {H, T}. Define an rv X such that X({H}) = 1 and X({T}) = 0. X is called a Bernoulli random variable.

- Toss a coin until a head is observed. The sample space is S = {H, TH, TTH, ...}. Define X = the number of tosses needed until a head is observed. Then X({TH}) = 2 and X({TTTTH}) = 5.

- Roll a pair of dice. Define
  X = the sum of the numbers on the dice,
  Y = the difference between the two numbers on the dice,
  Z = the maximum of the two numbers on the dice.
  Consider the outcome ω = (2, 3). Then X(ω) = 5, Y(ω) = 1, Z(ω) = 3.

- Select a location in the US at random and define Y = the height above sea level at the selected location. The largest possible value of Y is 14,494 (ft) and the smallest value of Y is -282 (ft). The sample space is S = {y : -282 ≤ y ≤ 14,494}.
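The idea that a random variable is simply a function on the sample space can be sketched in code. This Python illustration (our addition, not part of the original notes) encodes the pair-of-dice example; the names `sample_space`, `X`, `Y`, `Z` are ours:

```python
# Random variables as plain functions on a sample space.
# Outcomes of rolling a pair of dice are ordered pairs (i, j).
sample_space = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def X(w): return w[0] + w[1]        # sum of the numbers on the dice
def Y(w): return abs(w[0] - w[1])   # difference between the two numbers
def Z(w): return max(w)             # maximum of the two numbers

w = (2, 3)
values = (X(w), Y(w), Z(w))         # (5, 1, 3), as in the text
```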
Discrete and continuous random variables. A random variable that can take on a finite or at most countably infinite number of values is said to be discrete (countably infinite means that the members of the set can be listed in an infinite sequence in which there is a first element, a second element, and so on). Examples include:

- the gender of a randomly selected student in class
- the total number of coin tosses required for observing two heads
- the number of students who are absent on the first day of class
- the number of people arriving for treatment at an emergency room

A random variable that can take on values in an interval of real numbers is said to be continuous. Examples include:

- the depth at randomly chosen locations of a lake
- the amount of gas needed to drive to work on a given day
- the survival time of a cancer patient

We will focus on discrete random variables in Chapter 3 and consider continuous random variables in Chapter 4.

2 Probability Mass Function

Associated with each discrete random variable X is a probability mass function (pmf) that gives the probability that X equals x:

p(x) = P({X = x}) = P({all s ∈ S : X(s) = x}).
Example 1. Consider whether the next customer buying a laptop at a university bookstore buys a Mac or a PC model. Let

X = 1 if the customer purchases a Mac, and X = 0 if the customer purchases a PC.

If 20% of all customers during that week select a Mac, what is the pmf of the rv X?

Example 2. Suppose two fair dice are tossed. Let X be the random variable that is the sum of the two upturned faces. X is a discrete random variable since it has finitely many possible values (the 11 integers 2, 3, ..., 12). The probability mass function of X is

x      2     3     4     5     6     7     8     9     10    11    12
p(x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

It is often instructive to present the probability mass function in a graphical format, plotting p(x_i) on the y-axis against x_i on the x-axis.
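The pmf in Example 2 can be checked by enumerating the 36 equally likely outcomes. A Python sketch of ours, tabulating the sum with exact fractions:

```python
from collections import Counter
from fractions import Fraction

# Count how many of the 36 equally likely (i, j) pairs give each sum.
counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
pmf = {x: Fraction(n, 36) for x, n in sorted(counts.items())}
# pmf[2] = 1/36, pmf[7] = 6/36, and the probabilities sum to 1.
```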
[Plot: probability mass function of X, the two-dice sum]

Remarks: So far, we have been defining probability functions in terms of the elementary outcomes making up an experiment's sample space. Thus, if two fair dice were tossed, a probability was assigned to each of the 36 possible pairs of upturned faces. We have seen that in certain situations some attribute of an outcome may hold more interest for the experimenter than the outcome itself. A craps player, for example, may be concerned only that he throws a 7, not whether the 7 was the result of a 5 and a 2, a 4 and a 3, or a 6 and a 1. That being the case, it makes sense to replace the 36-member sample space S = {(i, j) : i = 1, ..., 6; j = 1, ..., 6} with the more relevant (and simpler) 11-member sample space of all possible two-dice sums, S = {x = i + j : x = 2, 3, ..., 12}.

This redefinition of the sample space not only changes the number of outcomes in the space (from 36 to 11) but also changes the probability structure. In the original sample space, all 36 outcomes are equally likely. In the revised sample space, the 11 outcomes are not equally likely.
Example 3. Three balls are to be randomly selected without replacement from an urn containing balls numbered 1 through 20. Let X denote the largest number selected. X is a random variable taking on the values 3, 4, ..., 20. Since we select the balls randomly, each of the C_{3,20} combinations of balls is equally likely to be chosen. The probability mass function of X is

P({X = i}) = C_{2,i-1} / C_{3,20},   i = 3, ..., 20.

This equation follows because the number of selections that result in the event {X = i} is just the number of selections that include the ball numbered i together with two of the balls numbered 1 through i - 1.

[Plot: probability mass function of X]

Suppose the random variable X can take on the values {x_1, x_2, ...}. Since the probability mass function is a probability function on the redefined sample space that considers the values of X, we have that

Σ_{i=1}^∞ P(X = x_i) = 1.
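The counting argument in Example 3 can be verified by brute force. In this sketch (our addition), the document's C_{r,n} is Python's `math.comb(n, r)`:

```python
from math import comb
from itertools import combinations
from fractions import Fraction

total = comb(20, 3)  # number of equally likely 3-ball selections

# The formula from the text: P(X = i) = C_{2,i-1} / C_{3,20}.
formula = {i: Fraction(comb(i - 1, 2), total) for i in range(3, 21)}

# Brute force: enumerate every selection and record its maximum.
brute = {i: 0 for i in range(3, 21)}
for s in combinations(range(1, 21), 3):
    brute[max(s)] += Fraction(1, total)
```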
This follows from

1 = P(S) = P(∪_{i=1}^∞ {X = x_i}) = Σ_{i=1}^∞ P(X = x_i).

Example 4. Independent trials, consisting of flipping a coin having probability p of coming up heads, are performed until either a head occurs or a total of n flips is made. Let X be the random variable that denotes the number of times the coin is flipped. The probability mass function of X is

P(X = 1) = P({H}) = p
P(X = 2) = P({(T, H)}) = (1 - p)p
P(X = 3) = P({(T, T, H)}) = (1 - p)^2 p
...
P(X = n - 1) = P({(T, T, ..., T, H)}) = (1 - p)^{n-2} p
P(X = n) = P({(T, T, ..., T, T)}) = (1 - p)^{n-1}

3 Cumulative Distribution Function

The cumulative distribution function (CDF) of a random variable X is the function

F(x) = P(X ≤ x) = Σ_{y : y ≤ x} p(y).
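The pmf of Example 4 and the CDF definition can be illustrated together. In this sketch of ours, the values p = 1/3 and n = 6 are hypothetical choices:

```python
from fractions import Fraction

p, n = Fraction(1, 3), 6   # hypothetical head probability and flip cap

# pmf of Example 4: geometric terms up to n-1, all-tails case at n.
pmf = {k: (1 - p) ** (k - 1) * p for k in range(1, n)}
pmf[n] = (1 - p) ** (n - 1)   # the first n-1 flips were all tails

def F(x):
    """CDF: F(x) = P(X <= x) = sum of p(y) over y <= x."""
    return sum(prob for k, prob in pmf.items() if k <= x)
```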
Example 5. The pmf of a random variable X is given by a table of values p(x) that involves an unknown constant c.

- What is c?
- What is the cdf of X?
- Calculate P(2 ≤ X ≤ 4).
All probability questions about X can be answered in terms of the cdf F. Specifically, for an integer-valued discrete random variable X,

P(a < X ≤ b) = F(b) - F(a)
P(a ≤ X ≤ b) = F(b) - F(a - 1)

for all a < b. The first identity can be seen by writing the event {X ≤ b} as the union of the mutually exclusive events {X ≤ a} and {a < X ≤ b}. That is,

{X ≤ b} = {X ≤ a} ∪ {a < X ≤ b}.

Therefore, we have

P(X ≤ b) = P(X ≤ a) + P(a < X ≤ b)

and the result follows.

Example 6. Consider selecting at random a student who is among the 15,000 registered for the current semester at NCSU. Let X = the number of courses for which the selected student is registered, and suppose that X has a pmf given by a table of values p(x). What is the probability that a student is registered for three or more courses?
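The identity P(a ≤ X ≤ b) = F(b) - F(a - 1) for integer-valued X can be checked numerically. This sketch (our addition) reuses the two-dice sum of Example 2, with a = 2 and b = 4:

```python
from fractions import Fraction

# pmf of the sum of two fair dice.
pmf = {}
for i in range(1, 7):
    for j in range(1, 7):
        pmf[i + j] = pmf.get(i + j, 0) + Fraction(1, 36)

def F(x):
    """CDF of the two-dice sum."""
    return sum(p for k, p in pmf.items() if k <= x)

direct = sum(p for k, p in pmf.items() if 2 <= k <= 4)   # P(2 <= X <= 4)
via_cdf = F(4) - F(2 - 1)                                # F(b) - F(a-1)
```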
4 Expected Value

Probability mass functions provide a global overview of a random variable's behavior. Detail that explicit, though, is not always necessary - or even helpful. Oftentimes we want to condense the information contained in the pmf by summarizing certain of its features with single numbers.

The first feature of a pmf that we will examine is central tendency, a term referring to the average value of a random variable. The most frequently used measure of central tendency is the expected value. For a discrete random variable X, the expected value is a weighted average of the possible values X can take on, each value being weighted by the probability that X assumes it:

E(X) = Σ_{x : p(x) > 0} x p(x).

A simple fact: E(X + Y) = E(X) + E(Y).

Example 7. Consider the experiment of rolling a die. Let X be the number on the face. Compute E(X). Now consider rolling a pair of dice. Let Y be the sum of the numbers. Compute E(Y).
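Example 7 can be worked directly from the definition; a short sketch of ours:

```python
from fractions import Fraction

# E(X) for one fair die: each face 1..6 has probability 1/6.
EX = sum(x * Fraction(1, 6) for x in range(1, 7))   # = 7/2

# E(Y) for the sum of two dice, using E(X1 + X2) = E(X1) + E(X2).
EY = 2 * EX                                         # = 7
```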
Example 8. Consider Example 6. What is the average number of courses per student at NCSU?

5 Expectation of a Function of a Random Variable

Suppose we are given a discrete random variable X along with its pmf and that we want to compute the expected value of some function of X, say g(X). One approach is to directly determine the pmf of g(X).

Example 9. Let X denote a random variable that takes on the values -1, 0, 1 with respective probabilities

P(X = -1) = .2, P(X = 0) = .5, P(X = 1) = .3.

Compute E(X^2).
Although the procedure used in the previous example will always enable us to compute the expected value of g(X) from knowledge of the pmf of X, there is another way of thinking about E[g(X)]. Noting that g(X) will equal g(x) whenever X is equal to x, it seems reasonable that E[g(X)] should just be a weighted average of the values g(x), with each g(x) being weighted by the probability that X is equal to x.

Proposition 1. If X is a discrete random variable that takes on one of the values x_i, i ≥ 1, with respective probabilities p(x_i), then for any real-valued function g,

E[g(X)] = Σ_i g(x_i) p(x_i).

Applying the proposition to Example 9,

E(X^2) = (-1)^2 (.2) + 0^2 (.5) + 1^2 (.3) = .5.

Proof of Proposition 1. Group the terms of Σ_i g(x_i) p(x_i) according to the distinct values y_j taken on by g:

Σ_i g(x_i) p(x_i) = Σ_j Σ_{i : g(x_i) = y_j} g(x_i) p(x_i)
                  = Σ_j y_j Σ_{i : g(x_i) = y_j} p(x_i)
                  = Σ_j y_j P(g(X) = y_j)
                  = E[g(X)].

Corollary 1 (The rule of expected value). If a and b are constants, then

E(aX + b) = a E(X) + b.

Proof of Corollary 1:

E(aX + b) = Σ_x (ax + b) p(x) = a Σ_x x p(x) + b Σ_x p(x) = a E(X) + b.
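Proposition 1 and Corollary 1 can be sanity-checked on the pmf of Example 9; in this sketch of ours, a = 3 and b = 2 are arbitrary constants we chose:

```python
from fractions import Fraction

# pmf of Example 9, written with exact fractions.
pmf = {-1: Fraction(2, 10), 0: Fraction(5, 10), 1: Fraction(3, 10)}

# Proposition 1 with g(x) = x^2:
E_X2 = sum(x ** 2 * p for x, p in pmf.items())   # = .5, matching the text

# Corollary 1: E(aX + b) = a E(X) + b, with a = 3, b = 2.
EX = sum(x * p for x, p in pmf.items())
lhs = sum((3 * x + 2) * p for x, p in pmf.items())
rhs = 3 * EX + 2
```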
Special cases of Corollary 1:

- E(aX) = a E(X).
- E(X + b) = E(X) + b.

Example 10. A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a certain period at $200 apiece. Let X denote the number of computers sold, and suppose that P(X = 0) = 0.1, P(X = 1) = 0.2, P(X = 2) = 0.3 and P(X = 3) = 0.4. Let h(X) denote the profit associated with selling X units. What is the expected profit?
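Example 10 can be worked out as follows; the form of `h` below comes from the stated prices (cost 3 × $500, sale $1000 each, repurchase $200 each):

```python
# Profit when x of the three computers are sold.
def h(x):
    revenue = 1000 * x + 200 * (3 - x)   # sales plus manufacturer buyback
    cost = 3 * 500
    return revenue - cost                # simplifies to 800x - 900

pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# E[h(X)] by Proposition 1; equals 800 E(X) - 900 = 800(2) - 900 = 700.
expected_profit = sum(h(x) * p for x, p in pmf.items())
```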
6 Variance

Another useful summary of a random variable's pmf, besides its central tendency, is its spread. This is a very important concept in real life. For example, in quality control of the lifetimes of hard disks, we not only want the lifetime of a hard disk to be long, but also want the lifetimes not to be too variable. Another example is in finance, where investors not only want investments with good returns (i.e., a high expected value) but also want the investments not to be too risky (i.e., a low spread).

A commonly used measure of spread is the variance of a random variable, which is the expected squared deviation of the random variable from its expected value. Specifically, let X have pmf p(x) and expected value µ. Then the variance of X, denoted by V(X), or just σ_X^2, is

V(X) = E[(X - µ)^2] = Σ_{x ∈ D} (x - µ)^2 p(x),

where D is the set of possible values of X. The second equality holds by applying Proposition 1.

Explanations and intuition for the variance:

- (X - µ)^2 is the squared deviation of X from its mean.
- The variance is a weighted average of squared deviations, where the weights are probabilities from the distribution.
- If most values of x are close to µ, then σ^2 will be relatively small.
- If most values of x are far away from µ, then σ^2 will be relatively large.

Definition: the standard deviation (SD) of X is σ_X = √V(X) = √(σ_X^2).
Consider the following situations:

The following three random variables have expected value 0 but very different spreads:

- X = 0 with probability 1.
- Y = -1 with probability 0.5, 1 with probability 0.5.
- Z = -100 with probability 0.5, 100 with probability 0.5.

Compare V(X), V(Y) and V(Z).

Suppose that the rate of return on stock A takes on the values 30%, 10% and -10% with respective probabilities 0.25, 0.50 and 0.25, and on stock B the values 50%, 10% and -30% with the same probabilities 0.25, 0.50 and 0.25. Each stock then has an expected rate of return of 10%. Obviously stock A has less spread in its rate of return. Compare V(A) and V(B).
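The two stock distributions can be compared numerically. A sketch of ours, with returns in percent and exact fractions for the probabilities:

```python
from fractions import Fraction

def mean_var(dist):
    """Mean and variance of a pmf given as {value: probability}."""
    mu = sum(x * p for x, p in dist.items())
    var = sum((x - mu) ** 2 * p for x, p in dist.items())
    return mu, var

q = Fraction(1, 4)
A = {30: q, 10: 2 * q, -10: q}   # stock A returns (%)
B = {50: q, 10: 2 * q, -30: q}   # stock B returns (%)

muA, varA = mean_var(A)   # mean 10, variance 200
muB, varB = mean_var(B)   # mean 10, variance 800
```

Both stocks have the same mean return, but stock B's variance is four times larger, which quantifies its extra risk.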
An alternative formula for the variance:

V(X) = E(X^2) - [E(X)]^2.

Proof. Let E(X) = µ. Then

V(X) = E[(X - µ)^2]
     = Σ_x (x - µ)^2 p(x)
     = Σ_x (x^2 - 2µx + µ^2) p(x)
     = Σ_x x^2 p(x) - 2µ Σ_x x p(x) + µ^2 Σ_x p(x)
     = E(X^2) - 2µ^2 + µ^2
     = E(X^2) - µ^2
     = E(X^2) - [E(X)]^2.

The variance of a linear function. Let a and b be two constants. Then

V(aX + b) = a^2 V(X).

Proof. Note that from Corollary 1, we have E(aX + b) = a E(X) + b. Let E(X) = µ. Then

V(aX + b) = E[{(aX + b) - E(aX + b)}^2]
          = E[(aX + b - aµ - b)^2]
          = E[a^2 (X - µ)^2]
          = a^2 E[(X - µ)^2]
          = a^2 V(X).
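Both identities above can be verified on any pmf; the distribution in this sketch is a hypothetical example of ours:

```python
from fractions import Fraction

pmf = {2: Fraction(1, 4), 5: Fraction(1, 2), 11: Fraction(1, 4)}  # hypothetical

mu = sum(x * p for x, p in pmf.items())
EX2 = sum(x ** 2 * p for x, p in pmf.items())

var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
var_shortcut = EX2 - mu ** 2                               # E(X^2) - [E(X)]^2

# V(aX + b) = a^2 V(X), checked for a = -3, b = 7.
a, b = -3, 7
lin = {a * x + b: p for x, p in pmf.items()}   # pmf of aX + b
mu_lin = sum(y * p for y, p in lin.items())
var_lin = sum((y - mu_lin) ** 2 * p for y, p in lin.items())
```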
Example 11. Let X denote the number of computers sold, and suppose that the pmf of X is

P(X = 0) = 0.1, P(X = 1) = 0.2, P(X = 2) = 0.3, P(X = 3) = 0.4.

The profit is a function of the number of computers sold: h(X) = 800X - 900. What are the variance and SD of the profit h(X)?
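Example 11 combines the shortcut formula with the linear-function rule; a worked sketch of ours:

```python
# pmf of X, the number of computers sold (Example 11).
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

EX = sum(x * p for x, p in pmf.items())          # E(X) = 2
EX2 = sum(x * x * p for x, p in pmf.items())     # E(X^2) = 5
VX = EX2 - EX ** 2                               # V(X) = E(X^2) - [E(X)]^2 = 1

# h(X) = 800X - 900, so V(h(X)) = 800^2 V(X) and SD(h(X)) = 800 sqrt(V(X)).
V_profit = 800 ** 2 * VX
SD_profit = V_profit ** 0.5
```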
More informationPrinciple of Data Reduction
Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then
More informationChapter 5. Discrete Probability Distributions
Chapter 5. Discrete Probability Distributions Chapter Problem: Did Mendel s result from plant hybridization experiments contradicts his theory? 1. Mendel s theory says that when there are two inheritable
More informationNotes on Probability Theory
Notes on Probability Theory Christopher King Department of Mathematics Northeastern University July 31, 2009 Abstract These notes are intended to give a solid introduction to Probability Theory with a
More informationLab 11. Simulations. The Concept
Lab 11 Simulations In this lab you ll learn how to create simulations to provide approximate answers to probability questions. We ll make use of a particular kind of structure, called a box model, that
More informationDefinition and Calculus of Probability
In experiments with multivariate outcome variable, knowledge of the value of one variable may help predict another. For now, the word prediction will mean update the probabilities of events regarding the
More informationStatistics and Random Variables. Math 425 Introduction to Probability Lecture 14. Finite valued Random Variables. Expectation defined
Expectation Statistics and Random Variables Math 425 Introduction to Probability Lecture 4 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan February 9, 2009 When a large
More informationWithout data, all you are is just another person with an opinion.
OCR Statistics Module Revision Sheet The S exam is hour 30 minutes long. You are allowed a graphics calculator. Before you go into the exam make sureyou are fully aware of the contents of theformula booklet
More informationWhat is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference
0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 14 10/27/2008 MOMENT GENERATING FUNCTIONS Contents 1. Moment generating functions 2. Sum of a ranom number of ranom variables 3. Transforms
More informationProbability Generating Functions
page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence
More informationProbability Distributions
CHAPTER 5 Probability Distributions CHAPTER OUTLINE 5.1 Probability Distribution of a Discrete Random Variable 5.2 Mean and Standard Deviation of a Probability Distribution 5.3 The Binomial Distribution
More informationLecture 2 Binomial and Poisson Probability Distributions
Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution l Consider a situation where there are only two possible outcomes (a Bernoulli trial) H Example: u flipping a
More informationACMS 10140 Section 02 Elements of Statistics October 28, 2010 Midterm Examination II Answers
ACMS 10140 Section 02 Elements of Statistics October 28, 2010 Midterm Examination II Answers Name DO NOT remove this answer page. DO turn in the entire exam. Make sure that you have all ten (10) pages
More informationLecture 1 Introduction Properties of Probability Methods of Enumeration Asrat Temesgen Stockholm University
Lecture 1 Introduction Properties of Probability Methods of Enumeration Asrat Temesgen Stockholm University 1 Chapter 1 Probability 1.1 Basic Concepts In the study of statistics, we consider experiments
More informationA review of the portions of probability useful for understanding experimental design and analysis.
Chapter 3 Review of Probability A review of the portions of probability useful for understanding experimental design and analysis. The material in this section is intended as a review of the topic of probability
More informationRANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS
RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. DISCRETE RANDOM VARIABLES.. Definition of a Discrete Random Variable. A random variable X is said to be discrete if it can assume only a finite or countable
More informationSome special discrete probability distributions
University of California, Los Angeles Department of Statistics Statistics 100A Instructor: Nicolas Christou Some special discrete probability distributions Bernoulli random variable: It is a variable that
More informationProbability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X
Week 6 notes : Continuous random variables and their probability densities WEEK 6 page 1 uniform, normal, gamma, exponential,chi-squared distributions, normal approx'n to the binomial Uniform [,1] random
More information1. (First passage/hitting times/gambler s ruin problem:) Suppose that X has a discrete state space and let i be a fixed state. Let
Copyright c 2009 by Karl Sigman 1 Stopping Times 1.1 Stopping Times: Definition Given a stochastic process X = {X n : n 0}, a random time τ is a discrete random variable on the same probability space as
More informationDescriptive Statistics
Y520 Robert S Michael Goal: Learn to calculate indicators and construct graphs that summarize and describe a large quantity of values. Using the textbook readings and other resources listed on the web
More informationSome probability and statistics
Appendix A Some probability and statistics A Probabilities, random variables and their distribution We summarize a few of the basic concepts of random variables, usually denoted by capital letters, X,Y,
More informationWald s Identity. by Jeffery Hein. Dartmouth College, Math 100
Wald s Identity by Jeffery Hein Dartmouth College, Math 100 1. Introduction Given random variables X 1, X 2, X 3,... with common finite mean and a stopping rule τ which may depend upon the given sequence,
More information2. Discrete random variables
2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be
More information