PROPERTIES OF PROBABILITY
- Kristina Stewart
- 7 years ago
PROPERTIES OF PROBABILITY

$S$ is the sample space; $A$, $B$ are arbitrary events; $A'$ is the complement of $A$.

Proposition: For any event $A$, $P(A') = 1 - P(A)$.

Proposition: If $A$ and $B$ are mutually exclusive, that is, $A \cap B = \emptyset$, then $P(A \cap B) = 0$.

Proposition: For any two events $A$ and $B$,
$$P(A \cup B) = P(A) + P(B) - P(A \cap B).$$

Definition: For any two events $A$ and $B$ with $P(B) > 0$, the conditional probability of $A$ given that $B$ has occurred is defined by
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

Multiplication Rule: $P(A \cap B) = P(A \mid B)\,P(B)$.

The Law of Total Probability: Let $A_1, A_2, \ldots, A_n$ be mutually exclusive and exhaustive events. Then for any other event $B$,
$$P(B) = P(B \mid A_1)P(A_1) + \cdots + P(B \mid A_n)P(A_n) = \sum_{i=1}^{n} P(B \mid A_i)P(A_i).$$

Definition: Two events $A$ and $B$ are independent if $P(A \mid B) = P(A)$, and dependent otherwise.

Proposition: $A$ and $B$ are independent if and only if $P(A \cap B) = P(A)P(B)$.
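The Multiplication Rule and the Law of Total Probability can be checked numerically. The two-urn setup below is a hypothetical example, not one from the notes; exact fractions avoid floating-point noise.

```python
from fractions import Fraction

# Hypothetical two-urn setup: A1, A2 partition the sample space
# (which urn is picked), B = "a white ball is drawn".
P_A = {"A1": Fraction(1, 2), "A2": Fraction(1, 2)}
P_B_given_A = {"A1": Fraction(3, 4), "A2": Fraction(1, 4)}

# Law of Total Probability: P(B) = sum_i P(B | A_i) P(A_i)
P_B = sum(P_B_given_A[a] * P_A[a] for a in P_A)

# Multiplication Rule recovers the joint: P(A1 and B) = P(B | A1) P(A1)
P_A1_and_B = P_B_given_A["A1"] * P_A["A1"]

print(P_B)         # 1/2
print(P_A1_and_B)  # 3/8
```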
DISCRETE RANDOM VARIABLES

Definition: For a given sample space $S$ of some experiment, a random variable (rv) is any rule that associates a number with each outcome in $S$.

Definition: Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.

Example: Consider the coin-tossing game, so $S = \{H, T\}$. Let $X$ be the random variable equal to 0 if the outcome is $T$ and equal to 1 if the outcome is $H$.

Definition: A random variable is said to be discrete if its set of possible values is a discrete set, i.e., it either consists of a finite number of elements or its elements can be listed in a sequence as $x_1, x_2, \ldots, x_n, \ldots$.

Definition: The probability mass function (pmf) of a discrete rv is defined for every real number $x$ as $p(x) = P(X = x)$.

Remark: For every possible value $x$ of the random variable, the pmf specifies the probability of observing that value when the experiment is performed. The following conditions are required for any pmf:
$$p(x) \ge 0 \quad \text{and} \quad \sum_x p(x) = 1.$$

Definition: The cumulative distribution function (cdf) $F(x)$ of a discrete rv $X$ with pmf $p(x)$ is defined for every number $x$ by
$$F(x) = P(X \le x) = \sum_{y:\, y \le x} p(y).$$
For any number $x$, $F(x)$ is the probability that the observed value of $X$ will be at most $x$.

Definition: Let $X$ be a discrete rv with set of possible values $D$ and pmf $p(x)$. The expected value or mean value of $X$, denoted by $E(X)$ or $\mu_X$, is
$$E(X) = \mu_X = \sum_{x \in D} x\,p(x).$$

Proposition (Rules of Expected Value): For any constants $a$ and $b$ and random variables $X$ and $Y$,
$$E(aX + b) = a\,E(X) + b, \qquad E(aX + bY) = a\,E(X) + b\,E(Y).$$
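As a concrete sketch of these definitions, take the (assumed) example $X =$ number of heads in two fair coin tosses, and compute the pmf conditions, the cdf, and the expected value:

```python
from fractions import Fraction

# Assumed toy pmf: X = number of heads in two fair coin tosses.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# A legitimate pmf is nonnegative and sums to 1
assert all(p >= 0 for p in pmf.values()) and sum(pmf.values()) == 1

# cdf: F(x) = P(X <= x) = sum of p(y) over y <= x
def F(x):
    return sum(p for y, p in pmf.items() if y <= x)

# Expected value: E(X) = sum of x p(x)
EX = sum(x * p for x, p in pmf.items())

print(F(1))  # 3/4
print(EX)    # 1
```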
Definition: Let $X$ have pmf $p(x)$ and expected value $\mu$. Then the variance of $X$, denoted by $\mathrm{Var}(X)$ or $\sigma_X^2$, is
$$\mathrm{Var}(X) = \sum_{x \in D} (x - \mu)^2 p(x) = E[(X - \mu)^2].$$
The standard deviation (SD) of $X$ is $\sigma_X = \sqrt{\sigma_X^2}$.

Proposition:
$$\mathrm{Var}(X) = \sigma_X^2 = \left[\sum_{x \in D} x^2 p(x)\right] - \mu^2 = E(X^2) - (E(X))^2.$$

Proposition (Rules of Variance): For any constants $a$ and $b$ and two independent random variables $X$ and $Y$,
$$\mathrm{Var}(aX + b) = a^2 \sigma_X^2, \qquad \sigma_{aX+b} = |a|\,\sigma_X,$$
$$\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y), \qquad \sigma_{aX+bY} = \sqrt{a^2 \sigma_X^2 + b^2 \sigma_Y^2}.$$
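A quick numerical check that the defining formula for variance and the shortcut $E(X^2) - (E(X))^2$ agree, using the same assumed two-coin pmf as above:

```python
from fractions import Fraction

# Assumed toy pmf: number of heads in two fair coin tosses.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
mu = sum(x * p for x, p in pmf.items())

# Definition: Var(X) = E[(X - mu)^2]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())

# Shortcut: Var(X) = E(X^2) - (E(X))^2
var_short = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

print(var_def, var_short)  # 1/2 1/2
```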
CONTINUOUS RANDOM VARIABLES

Definition: A random variable $X$ is said to be continuous if its set of possible values is an entire interval of numbers; that is, for some $A < B$, any number $x$ between $A$ and $B$ is a possible value of $X$.

Definition: A probability density function (pdf) of a continuous rv $X$ is a function $f(x)$ such that for any two numbers $a$ and $b$ with $a \le b$,
$$P(a \le X \le b) = \int_a^b f(x)\,dx.$$

Remark: The above definition means that the probability that $X$ takes on a value in the interval $[a, b]$ is the area under the graph of the density function $f(x)$ over that interval.

Proposition: For $f(x)$ to be a legitimate pdf, it must satisfy the following two conditions:
1. $f(x) \ge 0$ for all $x$;
2. $\int_{-\infty}^{+\infty} f(x)\,dx = 1$, that is, the area under the entire graph of $f(x)$ is equal to 1.

Proposition: If $X$ is a continuous rv, then for any number $c$, $P(X = c) = 0$.

Definition: The cumulative distribution function (cdf) $F(x)$ of a continuous rv $X$ with pdf $f(x)$ is defined for every number $x$ by
$$F(x) = P(X \le x) = \int_{-\infty}^{x} f(y)\,dy.$$

Remark: For each $x$, $F(x)$ is the area under the density curve to the left of $x$, so $F(x)$ increases (from 0 to 1) as $x$ increases.

Definition: The expected value or mean value of a continuous rv $X$ with pdf $f(x)$ is
$$E(X) = \mu_X = \int_{-\infty}^{+\infty} x\,f(x)\,dx.$$

Proposition: If $X$ is a continuous rv with pdf $f(x)$ and $h(X)$ is any function of $X$, then
$$E[h(X)] = \mu_{h(X)} = \int_{-\infty}^{+\infty} h(x)\,f(x)\,dx.$$
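These integrals can be approximated with a plain midpoint Riemann sum. The density $f(x) = 2x$ on $[0, 1]$ is an assumed example; for it, $P(0.25 \le X \le 0.75) = 0.75^2 - 0.25^2 = 0.5$ and $E(X) = 2/3$.

```python
# Assumed pdf for illustration: f(x) = 2x on [0, 1] (a valid density).
def f(x):
    return 2 * x if 0 <= x <= 1 else 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint Riemann sum approximation of the integral of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 1)                  # legitimacy check: area under f is 1
prob = integrate(f, 0.25, 0.75)             # P(0.25 <= X <= 0.75)
mean = integrate(lambda x: x * f(x), 0, 1)  # E(X)

print(round(total, 4), round(prob, 4), round(mean, 4))  # 1.0 0.5 0.6667
```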
Definition: The variance of a continuous rv $X$ with pdf $f(x)$ and expected value $\mu$ is
$$\sigma_X^2 = \mathrm{Var}(X) = \int_{-\infty}^{+\infty} (x - \mu)^2 f(x)\,dx = E[(X - \mu)^2].$$
The standard deviation (SD) of $X$ is $\sigma_X = \sqrt{\sigma_X^2}$.

Proposition:
$$\mathrm{Var}(X) = \sigma_X^2 = \left[\int_{-\infty}^{+\infty} x^2 f(x)\,dx\right] - \mu^2 = E(X^2) - (E(X))^2.$$
IMPORTANT DISCRETE RANDOM VARIABLES

1. Binomial Random Variable

Definition: An experiment for which the following four conditions hold is called a binomial experiment.
1. The experiment consists of a sequence of $n$ trials, where $n$ is fixed in advance of the experiment.
2. The trials are identical, and each trial can result in one of the same two possible outcomes, which we denote by success (S) or failure (F).
3. The trials are independent, so that the outcome of any particular trial does not influence the outcome of any other trial.
4. The probability of success is constant from trial to trial; we denote this probability by $p$.

Definition: Given a binomial experiment consisting of $n$ trials, the binomial random variable $X$ associated with this experiment is defined as $X =$ the number of S's among the $n$ trials.

Remark: A binomial random variable $X$ has two parameters, $n$ and $p$. We will use the notation $X \sim B(n, p)$.

Theorem: Let $X \sim B(n, p)$; that is, $X$ is a binomial rv with parameters $n$ and $p$. Then the pmf of $X$ is
$$f(x) = P(X = x) = \begin{cases} \binom{n}{x} p^x (1-p)^{n-x} & \text{if } x = 0, 1, \ldots, n, \\ 0 & \text{otherwise.} \end{cases}$$

Theorem: Let $X \sim B(n, p)$. Then $E(X) = \mu_X = np$ and $\mathrm{Var}(X) = \sigma_X^2 = np(1-p)$.

2. Poisson Random Variable

Definition: A random variable $X$ is said to have a Poisson distribution with parameter $\lambda$, that is $X \sim \mathcal{P}(\lambda)$, if the pmf of $X$ is
$$P(X = x) = e^{-\lambda} \frac{\lambda^x}{x!}, \qquad x = 0, 1, 2, \ldots$$
for some $\lambda > 0$.

The value of $\lambda$ is frequently a rate per unit time or unit area of occurrence of a certain event, and $X$ denotes the number of occurrences of this event during the unit time or area.
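The binomial pmf, mean, and variance can be verified directly with `math.comb`; $n = 10$ and $p = 0.3$ are assumed example values, for which $np = 3$ and $np(1-p) = 2.1$.

```python
from math import comb

# Binomial pmf: P(X = x) = C(n, x) p^x (1-p)^(n-x)
def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3  # assumed example values
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x**2 * q for x, q in enumerate(pmf)) - mean**2

print(round(sum(pmf), 6))                        # 1.0 (a legitimate pmf)
print(round(mean, 6), n * p)                     # matches E(X) = np
print(round(var, 6), round(n * p * (1 - p), 6))  # matches Var(X) = np(1-p)
```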
The Poisson probability model assumes that
1. the events occur independently,
2. the probability that an event occurs does not change in time,
3. the probability that an event will occur in an interval is proportional to the length of the interval,
4. the probability of more than one event occurring at the same time is vanishingly small.

Proposition: If $X$ has a Poisson distribution with parameter $\lambda$, then $E(X) = \mathrm{Var}(X) = \lambda$.

Proposition: Suppose that we have a sequence of binomial rv's $B(n, p)$, and we let $n \to \infty$ and $p \to 0$ in such a way that $np$ remains fixed at a value $\lambda > 0$. Then $B(n, p) \to \mathcal{P}(\lambda)$.

Remark: According to this proposition, in any binomial experiment in which $n$ is large and $p$ is small, $B(n, p) \approx \mathcal{P}(\lambda)$, where $\lambda = np$. As a rule of thumb, this approximation can safely be applied if $n \ge 100$, $p \le .01$, and $np \le 20$.

3. Geometric Random Variable

A geometric rv and distribution are based on an experiment satisfying the following conditions:
1. The experiment consists of a sequence of independent trials.
2. Each trial can result in either success (S) or failure (F).
3. The probability of success is constant from trial to trial, so $P(\text{S on trial } i) = p$ for $i = 1, 2, 3, \ldots$.
4. The experiment continues (trials are performed) until the first success has been observed.

Proposition: The pmf of a geometric rv $X$ with parameter $p = P(S)$ is
$$P(X = x) = (1-p)^{x-1} p, \qquad x = 1, 2, \ldots$$

Proposition: If $X$ is a geometric rv with parameter $p$, then
$$E(X) = \frac{1}{p}, \qquad \mathrm{Var}(X) = \frac{1-p}{p^2}.$$
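The Poisson limit can be seen numerically: with $n$ large and $p$ small (the values below are assumed for illustration), the $B(n, p)$ and $\mathcal{P}(\lambda = np)$ pmfs nearly coincide.

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# Assumed illustrative values: n large, p small, lambda = np
n, p = 1000, 0.003
lam = n * p  # 3.0

# Compare the two pmfs at the first few values of x
for x in range(6):
    print(x, round(binom_pmf(x, n, p), 5), round(poisson_pmf(x, lam), 5))
```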
4. Hypergeometric Random Variable

The assumptions leading to the hypergeometric distribution are as follows:
1. The population or set to be sampled consists of $N$ individuals, objects, or elements (a finite population).
2. Each individual can be characterized as a success (S) or a failure (F), and there are $M$ successes in the population.
3. A sample of $n$ individuals is drawn in such a way that each subset of size $n$ is equally likely to be chosen.

The random variable of interest is $X =$ the number of S's in the sample. The probability distribution of $X$ depends on the parameters $n$, $M$, and $N$.

Example: Suppose that a sample of size $n$ is to be chosen randomly (without replacement) from an urn containing $N$ balls, of which $M$ are white and $N - M$ are black. If $X$ denotes the number of white balls selected, then $X$ has a hypergeometric distribution with parameters $n$, $M$, and $N$.

Proposition: The pmf of a hypergeometric random variable $X$ with parameters $n$, $M$, and $N$ is given by
$$P(X = x) = \frac{\binom{M}{x}\binom{N-M}{n-x}}{\binom{N}{n}}$$
for $x$ an integer satisfying $\max(0, n - N + M) \le x \le \min(n, M)$.
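The hypergeometric pmf is straightforward to evaluate with `math.comb`. The urn below ($N = 20$ balls, $M = 8$ white, $n = 5$ drawn) is an assumed example; the mean is checked against the known formula $E(X) = nM/N$, which is not derived in these notes.

```python
from math import comb

# Hypergeometric pmf: P(X = x) = C(M, x) C(N-M, n-x) / C(N, n)
def hyper_pmf(x, n, M, N):
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Assumed urn example: N = 20 balls, M = 8 white, draw n = 5 without replacement.
N, M, n = 20, 8, 5
lo, hi = max(0, n - N + M), min(n, M)  # support of X
pmf = {x: hyper_pmf(x, n, M, N) for x in range(lo, hi + 1)}

print(round(sum(pmf.values()), 6))  # 1.0 (a legitimate pmf)
mean = sum(x * q for x, q in pmf.items())
print(round(mean, 6))  # known result E(X) = n*M/N = 2.0
```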
IMPORTANT CONTINUOUS RANDOM VARIABLES

1. Normal Distribution

Definition: A continuous rv $X$ is said to have a normal distribution with parameters $\mu$ and $\sigma^2$, where $-\infty < \mu < +\infty$ and $0 < \sigma$, if the pdf of $X$ is
$$f(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}, \qquad -\infty < x < +\infty.$$

Remark: The statement that $X$ is normally distributed with parameters $\mu$ and $\sigma^2$ is abbreviated $X \sim \mathcal{N}(\mu, \sigma^2)$.

Definition: The normal distribution with parameter values $\mu = 0$ and $\sigma = 1$ is called the standard normal distribution, and the random variable that has this distribution is called the standard normal random variable and will be denoted by $Z$. The pdf of $Z$ is
$$f(z; 0, 1) = \varphi(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}, \qquad -\infty < z < +\infty.$$
The cdf of $Z$ is
$$\Phi(z) = P(Z \le z) = \int_{-\infty}^{z} \varphi(y)\,dy.$$

Notation: $z_\alpha$ will denote the value on the measurement axis for which $\alpha$ of the area under the $z$ curve lies to the right of $z_\alpha$; that is, $P(Z \ge z_\alpha) = \alpha$.

Proposition: If $X \sim \mathcal{N}(\mu, \sigma^2)$, then
$$Z = \frac{X - \mu}{\sigma}$$
is a standard normal rv.

Empirical Rule: If the population distribution of a variable is (approximately) normal, then
1. roughly 68% of the values are within 1 SD (standard deviation) of the mean,
2. roughly 95% of the values are within 2 SDs of the mean,
3. roughly 99.7% of the values are within 3 SDs of the mean.
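$\Phi$ has no closed form, but it can be evaluated with the error function from the standard library via $\Phi(z) = \frac{1}{2}\left(1 + \mathrm{erf}(z/\sqrt{2})\right)$. The $\mathcal{N}(100, 15^2)$ parameters are an assumed example; the loop also checks the Empirical Rule.

```python
from math import erf, sqrt

# Standard normal cdf via the error function
def Phi(z):
    return (1 + erf(z / sqrt(2))) / 2

# Standardization: if X ~ N(mu, sigma^2), then P(X <= x) = Phi((x - mu) / sigma)
mu, sigma = 100, 15  # assumed example parameters
x = 130
print(round(Phi((x - mu) / sigma), 4))  # P(X <= 130) = Phi(2) ~ 0.9772

# Empirical Rule: P(|Z| <= k) for k = 1, 2, 3 SDs
for k in (1, 2, 3):
    print(k, round(Phi(k) - Phi(-k), 4))  # ~0.6827, 0.9545, 0.9973
```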
Proposition: Let $X$ be a binomial rv based on $n$ trials with success probability $p$. Then if the binomial probability histogram is not too skewed, $X$ has approximately a normal distribution with $\mu = np$ and $\sigma = \sqrt{npq}$. In particular, for a possible value $k$ of $X$,
$$P(X \le k) \approx \Phi\!\left(\frac{k + .5 - np}{\sqrt{npq}}\right).$$
In practice, the approximation is adequate provided that both $np \ge 5$ and $nq \ge 5$.

2. Lognormal Distribution

Definition: A nonnegative rv $X$ is said to have a lognormal distribution if the rv $Y = \ln(X)$ has a normal distribution. The resulting pdf of a lognormal rv, when $\ln(X) \sim \mathcal{N}(\mu, \sigma^2)$, is
$$f(x; \mu, \sigma) = \begin{cases} \frac{1}{x\sigma\sqrt{2\pi}}\, e^{-[\ln(x) - \mu]^2/(2\sigma^2)} & \text{if } x \ge 0, \\ 0 & \text{otherwise.} \end{cases}$$

Remark: Be careful here; $\mu$ and $\sigma$ are not the mean and standard deviation of $X$ but of $\ln(X)$.

Proposition: The mean and variance of $X$ can be shown to be
$$E(X) = e^{\mu + \sigma^2/2}, \qquad \mathrm{Var}(X) = e^{2\mu + \sigma^2}\left(e^{\sigma^2} - 1\right).$$

Because $\ln(X)$ has a normal distribution, the cdf of $X$ can be expressed in terms of the cdf $\Phi(z)$ of a standard normal rv $Z$. For $x \ge 0$,
$$F(x; \mu, \sigma) = P(X \le x) = P(\ln(X) \le \ln(x)) = P\!\left(Z \le \frac{\ln(x) - \mu}{\sigma}\right) = \Phi\!\left(\frac{\ln(x) - \mu}{\sigma}\right).$$

Remark: Suppose that $X_1$ and $X_2$ are independent rv's from a lognormal distribution with the same parameters. Let $Y_1 = \ln X_1$ and $Y_2 = \ln X_2$. Then
$$E\!\left(\frac{Y_1 + Y_2}{2}\right) = E\!\left(\frac{\ln X_1 + \ln X_2}{2}\right) = E\!\left(\ln\sqrt{X_1 X_2}\right).$$
In general, if we have $X_1, X_2, \ldots, X_n$ independent lognormals with the same parameters and $Y_i = \ln X_i$, $i = 1, \ldots, n$, then
$$E\!\left(\frac{Y_1 + \cdots + Y_n}{n}\right) = E\!\left(\ln \sqrt[n]{\prod_{i=1}^{n} X_i}\right).$$
Thus the mean of the transformed variables corresponds to the geometric mean of the original variables.
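A sketch of the continuity-corrected normal approximation against the exact binomial cdf, with assumed values $n = 50$, $p = 0.4$ (so $np = 20$ and $nq = 30$, both at least 5):

```python
from math import comb, erf, sqrt

def Phi(z):  # standard normal cdf via the error function
    return (1 + erf(z / sqrt(2))) / 2

def binom_cdf(k, n, p):  # exact P(X <= k) by summing the binomial pmf
    return sum(comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(k + 1))

n, p = 50, 0.4  # assumed example values
q = 1 - p
k = 22

exact = binom_cdf(k, n, p)
approx = Phi((k + 0.5 - n * p) / sqrt(n * p * q))  # continuity correction

print(round(exact, 4), round(approx, 4))  # the two values are close
```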
3. Exponential Distribution

Definition: A nonnegative rv $X$ is said to have an exponential distribution with parameter $\lambda$ if the pdf of $X$ is
$$f(x) = \begin{cases} \frac{1}{\lambda}\, e^{-x/\lambda} & \text{if } x \ge 0, \\ 0 & \text{if } x < 0. \end{cases}$$
The cdf of $X$ is $F(x) = P(X \le x) = 1 - e^{-x/\lambda}$ for $x \ge 0$.

Proposition: The mean and variance of $X$ can be shown to be $E(X) = \lambda$ and $\mathrm{Var}(X) = \lambda^2$.
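Inverting the cdf $F(x) = 1 - e^{-x/\lambda}$ gives a standard way to simulate this distribution: if $U$ is uniform on $(0, 1)$, then $-\lambda \ln(1 - U)$ has the cdf above. A sketch with the assumed value $\lambda = 2$; the sample mean and variance should land near $\lambda$ and $\lambda^2$.

```python
import random
from math import log

lam = 2.0  # assumed example value; E(X) = lam, Var(X) = lam^2

# Inverse-cdf sampling: solve u = 1 - exp(-x/lam) for x
random.seed(0)  # fixed seed so the run is reproducible
sample = [-lam * log(1 - random.random()) for _ in range(200_000)]

mean = sum(sample) / len(sample)
var = sum((x - mean) ** 2 for x in sample) / len(sample)

print(round(mean, 2), round(var, 2))  # close to 2.0 and 4.0
```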
DISTRIBUTIONS DERIVED FROM THE NORMAL DISTRIBUTION

Definition: A random variable $X$ with pdf
$$g(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \qquad x \ge 0,$$
has a gamma distribution with parameters $\alpha > 0$ and $\lambda > 0$. The gamma function $\Gamma(x)$ is defined as
$$\Gamma(x) = \int_0^\infty u^{x-1} e^{-u}\,du.$$

Properties of the Gamma Function:
(i) $\Gamma(x + 1) = x\Gamma(x)$
(ii) $\Gamma(n + 1) = n!$
(iii) $\Gamma(1/2) = \sqrt{\pi}$.

Remarks:
1. Notice that an exponential rv with parameter $1/\theta = \lambda$ is a special case of a gamma rv with parameters $\alpha = 1$ and $\lambda$.
2. The sum of $n$ independent identically distributed (iid) exponential rv's with parameter $\lambda$ has a gamma distribution with parameters $n$ and $\lambda$.
3. The sum of $n$ iid gamma rv's with parameters $\alpha$ and $\lambda$ has a gamma distribution with parameters $n\alpha$ and $\lambda$.

Definition: If $Z$ is a standard normal rv, the distribution of $U = Z^2$ is called the chi-square distribution with 1 degree of freedom. The density function of $U \sim \chi^2_1$ is
$$f_U(x) = \frac{x^{-1/2}}{\sqrt{2\pi}}\, e^{-x/2}, \qquad x > 0.$$

Remark: A $\chi^2_1$ random variable has the same density as a random variable with a gamma distribution with parameters $\alpha = 1/2$ and $\lambda = 1/2$.

Definition: If $U_1, U_2, \ldots, U_k$ are independent chi-square rv's with 1 degree of freedom, the distribution of $V = U_1 + U_2 + \cdots + U_k$ is called the chi-square distribution with $k$ degrees of freedom. Using Remark 3 and the above remark, a $\chi^2_k$ rv follows a gamma distribution with parameters $\alpha = k/2$ and $\lambda = 1/2$. Thus the density function of $V \sim \chi^2_k$ is
$$f_V(x) = \frac{1}{2^{k/2}\,\Gamma(k/2)}\, x^{k/2-1} e^{-x/2}, \qquad x > 0.$$

Proposition: If $V$ has a chi-square distribution with $k$ degrees of freedom, then $E(V) = k$ and $\mathrm{Var}(V) = 2k$.
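The three gamma-function properties can be checked numerically with `math.gamma`, and the gamma pdf with $\alpha = 1$ can be seen to reduce to an exponential pdf (written here in the rate form $\lambda e^{-\lambda x}$, matching Remark 1's $\lambda = 1/\theta$; the evaluation point and rate are assumed example values).

```python
from math import gamma, factorial, pi, sqrt, exp

# Numerical checks of the three gamma-function properties:
assert abs(gamma(4.5) - 3.5 * gamma(3.5)) < 1e-9  # (i)   Gamma(x+1) = x Gamma(x)
assert gamma(6) == factorial(5)                   # (ii)  Gamma(n+1) = n!
assert abs(gamma(0.5) - sqrt(pi)) < 1e-12         # (iii) Gamma(1/2) = sqrt(pi)

# Gamma pdf: g(x) = lam^alpha / Gamma(alpha) * x^(alpha-1) * e^(-lam x)
def gamma_pdf(x, alpha, lam):
    return lam**alpha / gamma(alpha) * x ** (alpha - 1) * exp(-lam * x)

# With alpha = 1 this is the exponential pdf lam * e^(-lam x)
x, lam = 1.3, 0.7  # assumed example point and rate
print(round(gamma_pdf(x, 1, lam), 6), round(lam * exp(-lam * x), 6))
```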
Definition: If $Z \sim \mathcal{N}(0, 1)$ and $U \sim \chi^2_n$, and $Z$ and $U$ are independent, then the distribution of
$$\frac{Z}{\sqrt{U/n}}$$
is called the $t$ distribution with $n$ degrees of freedom.

Proposition: The density function of the $t$ distribution with $n$ degrees of freedom is
$$f(t) = \frac{\Gamma[(n+1)/2]}{\sqrt{n\pi}\,\Gamma(n/2)} \left(1 + \frac{t^2}{n}\right)^{-(n+1)/2}.$$

Remarks: For the above density, $f(t) = f(-t)$, so the $t$ density is symmetric about zero. As the number of degrees of freedom approaches infinity, the $t$ distribution tends to the standard normal.

Definition: Let $U$ and $V$ be independent chi-square variables with $m$ and $n$ degrees of freedom, respectively. The distribution of
$$W = \frac{U/m}{V/n}$$
is called the $F$ distribution with $m$ and $n$ degrees of freedom and is denoted by $F_{m,n}$.

Remarks:
(i) If $T \sim t_n$, then $T^2 \sim F_{1,n}$.
(ii) If $X \sim F_{n,m}$, then $X^{-1} \sim F_{m,n}$.
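A sketch of the $t$ density formula, computed via `math.lgamma` so the gamma ratio does not overflow for large $n$; it illustrates the symmetry $f(t) = f(-t)$ and the convergence toward the standard normal density as $n$ grows.

```python
from math import lgamma, exp, sqrt, pi

# t density with n degrees of freedom (formula above), via log-gamma
def t_pdf(t, n):
    c = exp(lgamma((n + 1) / 2) - lgamma(n / 2)) / sqrt(n * pi)
    return c * (1 + t * t / n) ** (-(n + 1) / 2)

def phi(z):  # standard normal density
    return exp(-z * z / 2) / sqrt(2 * pi)

print(t_pdf(1.0, 5) == t_pdf(-1.0, 5))            # True: symmetric about zero
print(round(t_pdf(0, 5), 4), round(phi(0), 4))    # lower peak (heavier tails) at n = 5
print(round(t_pdf(0, 500), 4), round(phi(0), 4))  # nearly normal at n = 500
```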
COVARIANCE AND CORRELATION OF RANDOM VARIABLES

Definition: Let $X$ and $Y$ be random variables with expected values $\mu_X$ and $\mu_Y$, respectively. The covariance of $X$ and $Y$ is
$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)],$$
provided that the expectation exists.

Proposition:
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y).$$
Proof: By definition,
$$\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY - X\mu_Y - Y\mu_X + \mu_X\mu_Y) = E(XY) - \mu_X\mu_Y - \mu_X\mu_Y + \mu_X\mu_Y = E(XY) - E(X)E(Y).$$

Proposition:
(i) If $X$ and $Y$ are independent random variables, then $\mathrm{Cov}(X, Y) = 0$.
(ii) If $X = Y$ with $\mathrm{Var}(X) = \mathrm{Var}(Y) = \sigma^2$, then $\mathrm{Cov}(X, Y) = \mathrm{Var}(X) = \sigma^2$.

Definition: If $X$ and $Y$ are random variables whose variances and covariance exist and whose variances are nonzero, then the correlation of $X$ and $Y$, denoted by $\rho$, is
$$\rho = \mathrm{Cor}(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}.$$

Proposition:
(i) $-1 \le \rho \le 1$.
(ii) $\rho = \pm 1$ if and only if $X = a + bY$ for some constants $a$ and $b$.

Proposition: Let $X$ and $Y$ be arbitrary random variables whose variances and covariance exist. Then
$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y).$$
Proof:
$$\mathrm{Var}(X + Y) = E[(X + Y - \mu_X - \mu_Y)^2] = E[((X - \mu_X) + (Y - \mu_Y))^2] = E[(X - \mu_X)^2 + (Y - \mu_Y)^2 + 2(X - \mu_X)(Y - \mu_Y)] = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y).$$
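These identities can be verified exactly on a small joint pmf. The pmf below (two dependent 0/1 variables) is an assumed example, not from the notes; it checks the covariance shortcut, the correlation, and the variance-of-a-sum formula.

```python
from fractions import Fraction
from math import sqrt

# Assumed joint pmf p(x, y) for two dependent 0/1 random variables
joint = {(0, 0): Fraction(2, 5), (1, 1): Fraction(2, 5),
         (0, 1): Fraction(1, 10), (1, 0): Fraction(1, 10)}

def E(g):  # E[g(X, Y)] over the joint pmf
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)

# Cov(X, Y) two ways: the definition and the shortcut E(XY) - E(X)E(Y)
cov = E(lambda x, y: (x - EX) * (y - EY))
cov_short = E(lambda x, y: x * y) - EX * EY
print(cov, cov_short)  # 3/20 3/20

varX = E(lambda x, y: (x - EX) ** 2)
varY = E(lambda x, y: (y - EY) ** 2)
rho = float(cov) / sqrt(float(varX * varY))
print(rho)  # 0.6, inside [-1, 1]

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
varXpY = E(lambda x, y: (x + y - EX - EY) ** 2)
print(varXpY == varX + varY + 2 * cov)  # True
```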
BNG 202 Biomechanics Lab Descriptive statistics and probability distributions I Overview The overall goal of this short course in statistics is to provide an introduction to descriptive and inferential
More informationA review of the portions of probability useful for understanding experimental design and analysis.
Chapter 3 Review of Probability A review of the portions of probability useful for understanding experimental design and analysis. The material in this section is intended as a review of the topic of probability
More informationExploratory Data Analysis
Exploratory Data Analysis Johannes Schauer johannes.schauer@tugraz.at Institute of Statistics Graz University of Technology Steyrergasse 17/IV, 8010 Graz www.statistics.tugraz.at February 12, 2008 Introduction
More informationInstitute of Actuaries of India Subject CT3 Probability and Mathematical Statistics
Institute of Actuaries of India Subject CT3 Probability and Mathematical Statistics For 2015 Examinations Aim The aim of the Probability and Mathematical Statistics subject is to provide a grounding in
More informationGenerating Random Numbers Variance Reduction Quasi-Monte Carlo. Simulation Methods. Leonid Kogan. MIT, Sloan. 15.450, Fall 2010
Simulation Methods Leonid Kogan MIT, Sloan 15.450, Fall 2010 c Leonid Kogan ( MIT, Sloan ) Simulation Methods 15.450, Fall 2010 1 / 35 Outline 1 Generating Random Numbers 2 Variance Reduction 3 Quasi-Monte
More informationExample. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away)
: Three bets Math 45 Introduction to Probability Lecture 5 Kenneth Harris aharri@umich.edu Department of Mathematics University of Michigan February, 009. A casino offers the following bets (the fairest
More informationHomework 4 - KEY. Jeff Brenion. June 16, 2004. Note: Many problems can be solved in more than one way; we present only a single solution here.
Homework 4 - KEY Jeff Brenion June 16, 2004 Note: Many problems can be solved in more than one way; we present only a single solution here. 1 Problem 2-1 Since there can be anywhere from 0 to 4 aces, the
More informationProbability Calculator
Chapter 95 Introduction Most statisticians have a set of probability tables that they refer to in doing their statistical wor. This procedure provides you with a set of electronic statistical tables that
More informationSCHOOL OF ENGINEERING & BUILT ENVIRONMENT. Mathematics
SCHOOL OF ENGINEERING & BUILT ENVIRONMENT Mathematics Probability and Probability Distributions 1. Introduction 2. Probability 3. Basic rules of probability 4. Complementary events 5. Addition Law for
More informationMath 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand. Practice Test, 1/28/2008 (with solutions)
Math 370, Actuarial Problemsolving Spring 008 A.J. Hildebrand Practice Test, 1/8/008 (with solutions) About this test. This is a practice test made up of a random collection of 0 problems from past Course
More information0 x = 0.30 x = 1.10 x = 3.05 x = 4.15 x = 6 0.4 x = 12. f(x) =
. A mail-order computer business has si telephone lines. Let X denote the number of lines in use at a specified time. Suppose the pmf of X is as given in the accompanying table. 0 2 3 4 5 6 p(.0.5.20.25.20.06.04
More information6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS
6.4/6.43 Spring 28 Quiz 2 Wednesday, April 6, 7:3-9:3 PM. SOLUTIONS Name: Recitation Instructor: TA: 6.4/6.43: Question Part Score Out of 3 all 36 2 a 4 b 5 c 5 d 8 e 5 f 6 3 a 4 b 6 c 6 d 6 e 6 Total
More informationReview of Random Variables
Chapter 1 Review of Random Variables Updated: January 16, 2015 This chapter reviews basic probability concepts that are necessary for the modeling and statistical analysis of financial data. 1.1 Random
More informationStatistics 100A Homework 4 Solutions
Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation
More informationCovariance and Correlation
Covariance and Correlation ( c Robert J. Serfling Not for reproduction or distribution) We have seen how to summarize a data-based relative frequency distribution by measures of location and spread, such
More informationStat 704 Data Analysis I Probability Review
1 / 30 Stat 704 Data Analysis I Probability Review Timothy Hanson Department of Statistics, University of South Carolina Course information 2 / 30 Logistics: Tuesday/Thursday 11:40am to 12:55pm in LeConte
More informationMath 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 5 Solutions
Math 370/408, Spring 2008 Prof. A.J. Hildebrand Actuarial Exam Practice Problem Set 5 Solutions About this problem set: These are problems from Course 1/P actuarial exams that I have collected over the
More information6 PROBABILITY GENERATING FUNCTIONS
6 PROBABILITY GENERATING FUNCTIONS Certain derivations presented in this course have been somewhat heavy on algebra. For example, determining the expectation of the Binomial distribution (page 5.1 turned
More informationDiscrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability
CS 7 Discrete Mathematics and Probability Theory Fall 29 Satish Rao, David Tse Note 8 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces
More information1 Sufficient statistics
1 Sufficient statistics A statistic is a function T = rx 1, X 2,, X n of the random sample X 1, X 2,, X n. Examples are X n = 1 n s 2 = = X i, 1 n 1 the sample mean X i X n 2, the sample variance T 1 =
More informationSTT315 Chapter 4 Random Variables & Probability Distributions KM. Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables
Chapter 4.5, 6, 8 Probability Distributions for Continuous Random Variables Discrete vs. continuous random variables Examples of continuous distributions o Uniform o Exponential o Normal Recall: A random
More informationThe Kelly Betting System for Favorable Games.
The Kelly Betting System for Favorable Games. Thomas Ferguson, Statistics Department, UCLA A Simple Example. Suppose that each day you are offered a gamble with probability 2/3 of winning and probability
More informationCA200 Quantitative Analysis for Business Decisions. File name: CA200_Section_04A_StatisticsIntroduction
CA200 Quantitative Analysis for Business Decisions File name: CA200_Section_04A_StatisticsIntroduction Table of Contents 4. Introduction to Statistics... 1 4.1 Overview... 3 4.2 Discrete or continuous
More informationMath/Stats 342: Solutions to Homework
Math/Stats 342: Solutions to Homework Steven Miller (sjm1@williams.edu) November 17, 2011 Abstract Below are solutions / sketches of solutions to the homework problems from Math/Stats 342: Probability
More information6.2. Discrete Probability Distributions
6.2. Discrete Probability Distributions Discrete Uniform distribution (diskreetti tasajakauma) A random variable X follows the dicrete uniform distribution on the interval [a, a+1,..., b], if it may attain
More informationPractice problems for Homework 11 - Point Estimation
Practice problems for Homework 11 - Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:
More information2. Discrete random variables
2. Discrete random variables Statistics and probability: 2-1 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 5 9/17/2008 RANDOM VARIABLES Contents 1. Random variables and measurable functions 2. Cumulative distribution functions 3. Discrete
More informationMath 370, Spring 2008 Prof. A.J. Hildebrand. Practice Test 2 Solutions
Math 370, Spring 008 Prof. A.J. Hildebrand Practice Test Solutions About this test. This is a practice test made up of a random collection of 5 problems from past Course /P actuarial exams. Most of the
More informationThe normal approximation to the binomial
The normal approximation to the binomial The binomial probability function is not useful for calculating probabilities when the number of trials n is large, as it involves multiplying a potentially very
More informationExponential Distribution
Exponential Distribution Definition: Exponential distribution with parameter λ: { λe λx x 0 f(x) = 0 x < 0 The cdf: F(x) = x Mean E(X) = 1/λ. f(x)dx = Moment generating function: φ(t) = E[e tx ] = { 1
More informationChapter 4. Probability and Probability Distributions
Chapter 4. robability and robability Distributions Importance of Knowing robability To know whether a sample is not identical to the population from which it was selected, it is necessary to assess the
More informationData Modeling & Analysis Techniques. Probability & Statistics. Manfred Huber 2011 1
Data Modeling & Analysis Techniques Probability & Statistics Manfred Huber 2011 1 Probability and Statistics Probability and statistics are often used interchangeably but are different, related fields
More informationUniversity of California, Los Angeles Department of Statistics. Random variables
University of California, Los Angeles Department of Statistics Statistics Instructor: Nicolas Christou Random variables Discrete random variables. Continuous random variables. Discrete random variables.
More information