Continuous Random Variables


COMP 245 STATISTICS
Dr N A Heard

Contents
1 Continuous Random Variables
  1.1 Introduction
  1.2 Probability Density Functions
  1.3 Transformations
2 Mean, Variance and Quantiles
  2.1 Expectation
  2.2 Variance
  2.3 Quantiles
3 Continuous Distributions
  3.1 Uniform
  3.2 Exponential
  3.3 Gaussian

1 Continuous Random Variables

1.1 Introduction

Definition
Suppose again we have a random experiment with sample space S and probability measure P. Recall our definition of a random variable as a mapping X : S → R from the sample space S to the real numbers, inducing a probability measure P_X(B) = P{X⁻¹(B)}, B ⊆ R. We define the random variable X to be (absolutely) continuous if there exists f_X : R → R such that

    P_X(B) = ∫_B f_X(x) dx,  B ⊆ R,    (1)

in which case f_X is referred to as the probability density function (pdf) of X.

Comments
A connected sequence of comments:
- One consequence of this definition is that the probability of any singleton set B = {x}, x ∈ R, is zero for a continuous random variable: P_X(X = x) = P_X({x}) = 0.
- This in turn implies that any countable set B = {x₁, x₂, ...} ⊆ R will have zero probability measure for a continuous random variable, since P_X(X ∈ B) = P_X(X = x₁) + P_X(X = x₂) + ... = 0.
- This automatically implies that the range of a continuous random variable will be uncountable.
- This tells us that a random variable cannot be both discrete and continuous.

Examples
The following quantities would typically be modelled with continuous random variables. They are measurements of time, distance and other phenomena that can, at least in theory, be determined to an arbitrarily high degree of accuracy.
- The height or weight of a randomly chosen individual from a population.
- The duration of this lecture.
- The volume of fuel consumed by a bus on its route.
- The total distance driven by a taxi cab in a day.
Note that there are obvious ways to discretise all of the above examples.

1.2 Probability Density Functions

pdf as the derivative of the cdf
From (1), we notice the cumulative distribution function (cdf) for a continuous random variable X is given by

    F_X(x) = ∫_{-∞}^{x} f_X(t) dt,  x ∈ R.

This expression leads us to a definition of the pdf of a continuous rv X for which we already have the cdf; by the Fundamental Theorem of Calculus we find the pdf of X to be given by

    f_X(x) = d/dx F_X(x), also written F'_X(x).

Properties of a pdf
Since the pdf is the derivative of the cdf, and because we know that a cdf is non-decreasing, the pdf will always be non-negative. So, in the same way as we did for cdfs and discrete pmfs, we have the following checklist to ensure f_X is a valid pdf:
1. f_X(x) ≥ 0, ∀x ∈ R;
2. ∫_{-∞}^{∞} f_X(x) dx = 1.

Interval Probabilities
Suppose we are interested in whether a continuous rv X lies in an interval (a, b]. From the definition of a continuous random variable, this is given by

    P_X(a < X ≤ b) = ∫_a^b f_X(x) dx.

That is, the area under the pdf between a and b.
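The two checklist properties and an interval probability are easy to verify numerically. A minimal sketch (Python, standard library only); the quadratic density used here is purely an illustrative choice:

```python
def integrate(f, a, b, n=10_000):
    """Trapezoidal-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# An illustrative pdf: f(x) = x^2 / 9 on [0, 3], zero elsewhere.
def f(x):
    return x**2 / 9 if 0 <= x <= 3 else 0.0

total_mass = integrate(f, 0, 3)   # checklist property 2: should be ~1
p_interval = integrate(f, 1, 2)   # P(1 < X <= 2): area under the pdf between 1 and 2
print(total_mass, p_interval)
```

With a fine enough grid, the total mass comes out as 1 and the interval probability as 7/27, matching the exact calculus.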

[Figure: a pdf f(x), with the shaded area between a and b representing P(a < X < b).]

Further Comments:
- Besides still being a non-decreasing function satisfying F_X(-∞) = 0, F_X(∞) = 1, the cdf F_X of a continuous random variable X is also (absolutely) continuous.
- For a continuous rv, since P(X = x) = 0, F(x) = P(X ≤ x) = P(X < x).
- For small δx, f_X(x)δx is approximately the probability that X takes a value in the small interval [x, x + δx).
- Since the density (pdf) f_X(x) is not itself a probability, then unlike the pmf of a discrete rv we do not require f_X(x) ≤ 1.
- From (1) it is clear that the pdf of a continuous rv X completely characterises its distribution, so we often just specify f_X.

Example
Suppose we have a continuous random variable X with probability density function

    f(x) = { c x²,  0 < x < 3
           { 0,     otherwise

for some unknown constant c.

Questions
1. Determine c.
2. Find the cdf of X.
3. Calculate P(1 < X < 2).

Solutions
1. To find c: we must have

    1 = ∫_{-∞}^{∞} f(x) dx = ∫_0^3 c x² dx = c [x³/3]₀³ = 9c,

so c = 1/9.

2. F(x) = ∫_{-∞}^{x} f(u) du = ∫_0^x (u²/9) du = x³/27 for 0 ≤ x ≤ 3, so

    F(x) = { 0,      x < 0
           { x³/27,  0 ≤ x ≤ 3
           { 1,      x > 3.

3. P(1 < X < 2) = F(2) - F(1) = 8/27 - 1/27 = 7/27.

[Figure: the pdf f(x) and cdf F(x) for this example.]

1.3 Transformations

Transforming random variables
Suppose we have a continuous random variable X and wish to consider the transformed random variable Y = g(X) for some function g : R → R, such that g is continuous and strictly monotonic (so g⁻¹ exists).

Suppose g is monotonic increasing. Then for y ∈ R, Y ≤ y ⟺ X ≤ g⁻¹(y). So

    F_Y(y) = P_Y(Y ≤ y) = P_X(X ≤ g⁻¹(y)) = F_X(g⁻¹(y)).

By the chain rule of differentiation,

    f_Y(y) = F'_Y(y) = f_X{g⁻¹(y)} g⁻¹'(y).

Note g⁻¹'(y) = d/dy g⁻¹(y) is positive since we assumed g was increasing.

If we had g monotonic decreasing, Y ≤ y ⟺ X ≥ g⁻¹(y), and

    F_Y(y) = P_X(X ≥ g⁻¹(y)) = 1 - F_X(g⁻¹(y)).

So by comparison with before, we would have

    f_Y(y) = F'_Y(y) = -f_X{g⁻¹(y)} g⁻¹'(y),

with g⁻¹'(y) always negative. So overall, for Y = g(X) we have

    f_Y(y) = f_X{g⁻¹(y)} |g⁻¹'(y)|.    (2)
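Formula (2) can be checked numerically: compute F_Y(y) = F_X(g⁻¹(y)) directly, differentiate it by finite differences, and compare against the change-of-variables formula. A sketch using X ~ Exp(1) and the increasing transformation Y = X² (both choices arbitrary):

```python
import math

def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0   # Exp(1) pdf

def F_X(x):
    return 1 - math.exp(-x) if x >= 0 else 0.0   # Exp(1) cdf

g_inv = lambda y: math.sqrt(y)            # inverse of g(x) = x^2 on x >= 0
g_inv_prime = lambda y: 0.5 / math.sqrt(y)

def f_Y_formula(y):
    # the change-of-variables formula (2)
    return f_X(g_inv(y)) * abs(g_inv_prime(y))

def f_Y_numeric(y, h=1e-6):
    # differentiate F_Y(y) = F_X(g^{-1}(y)) by central differences
    return (F_X(g_inv(y + h)) - F_X(g_inv(y - h))) / (2 * h)

y = 1.7
print(f_Y_formula(y), f_Y_numeric(y))   # the two should agree closely
```

The agreement at any test point y > 0 confirms that (2) is exactly the derivative of the transformed cdf.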

2 Mean, Variance and Quantiles

2.1 Expectation

E(X)
For a continuous random variable X we define the mean or expectation of X,

    µ_X or E_X(X) = ∫_{-∞}^{∞} x f_X(x) dx.

More generally, for a function of interest of the random variable g : R → R we have

    E_X{g(X)} = ∫_{-∞}^{∞} g(x) f_X(x) dx.

Linearity of Expectation
Clearly, for continuous random variables we again have linearity of expectation,

    E(aX + b) = aE(X) + b,  ∀a, b ∈ R,

and, for two functions g, h : R → R, we have additivity of expectation,

    E{g(X) + h(X)} = E{g(X)} + E{h(X)}.

2.2 Variance

Var(X)
The variance of a continuous random variable X is given by

    σ²_X or Var_X(X) = E{(X - µ_X)²} = ∫_{-∞}^{∞} (x - µ_X)² f_X(x) dx,

and again it is easy to show that

    Var_X(X) = ∫_{-∞}^{∞} x² f_X(x) dx - µ²_X = E(X²) - {E(X)}².

For a linear transformation aX + b we again have Var(aX + b) = a² Var(X), ∀a, b.

2.3 Quantiles

Q_X(α)
Recall we defined the lower quartile, median and upper quartile of a sample of data as points (¼, ½, ¾)-way through the ordered sample. Analogously, for α ∈ [0, 1] and a continuous random variable X we define the α-quantile of X, Q_X(α), as

    Q_X(α) = min_{q ∈ R} {q : F_X(q) = α}.

If F_X is invertible then Q_X(α) = F_X⁻¹(α).

In particular the median of a random variable X is Q_X(0.5). That is, the median is a solution x of the equation F_X(x) = 1/2.

Example (continued)
Again suppose we have a continuous random variable X with probability density function given by

    f(x) = { x²/9, 0 < x < 3
           { 0,    otherwise.

Questions
1. Calculate E(X).
2. Calculate Var(X).
3. Calculate the median of X.

Solutions
1. E(X) = ∫ x f(x) dx = ∫_0^3 (x³/9) dx = [x⁴/36]₀³ = 81/36 = 2.25.
2. E(X²) = ∫ x² f(x) dx = ∫_0^3 (x⁴/9) dx = [x⁵/45]₀³ = 243/45 = 5.4.
   So Var(X) = E(X²) - {E(X)}² = 5.4 - 2.25² = 0.3375.
3. From earlier, F(x) = x³/27 for 0 < x < 3. Setting F(x) = 1/2 for the median and solving, we get x³/27 = 1/2 ⟹ x = (27/2)^{1/3} ≈ 2.381.
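The three worked answers can be reproduced numerically: trapezoidal integration for the moments, and bisection to invert the cdf for the median. A minimal sketch:

```python
def integrate(f, a, b, n=10_000):
    """Trapezoidal-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

f = lambda x: x**2 / 9    # pdf on (0, 3)
F = lambda x: x**3 / 27   # cdf on (0, 3)

mean = integrate(lambda x: x * f(x), 0, 3)          # E(X) = 2.25
second = integrate(lambda x: x**2 * f(x), 0, 3)     # E(X^2) = 5.4
var = second - mean**2                              # 0.3375

# Median: solve F(x) = 1/2 on (0, 3) by bisection.
lo, hi = 0.0, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if F(mid) < 0.5:
        lo = mid
    else:
        hi = mid
median = (lo + hi) / 2                              # ~2.381
print(mean, var, median)
```

Bisection works here because a cdf is non-decreasing, so F(x) - 1/2 changes sign exactly once on the support.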

3 Continuous Distributions

3.1 Uniform

U(a, b)
Suppose X is a continuous random variable with probability density function

    f(x) = { 1/(b - a), a < x < b
           { 0,         otherwise,

and hence corresponding cumulative distribution function

    F(x) = { 0,               x ≤ a
           { (x - a)/(b - a), a < x < b
           { 1,               x ≥ b.

Then X is said to follow a uniform distribution on the interval (a, b) and we write X ~ U(a, b).

[Figure: the pdf f(x) and cdf F(x) of U(0, 1).]

Notice from the cdf that the quantiles of U(0, 1) are the special case where Q(α) = α.

Relationship between U(a, b) and U(0, 1)
Suppose X ~ U(0, 1), so F_X(x) = x, 0 ≤ x ≤ 1. For a < b ∈ R, if Y = a + (b - a)X then Y ~ U(a, b).

Proof: We first observe that for any y ∈ (a, b),

    Y ≤ y ⟺ a + (b - a)X ≤ y ⟺ X ≤ (y - a)/(b - a).

From this we find Y ~ U(a, b), since

    F_Y(y) = P(Y ≤ y) = P_X(X ≤ (y - a)/(b - a)) = F_X((y - a)/(b - a)) = (y - a)/(b - a).
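This location-scale relationship can be illustrated by simulation; a sketch assuming nothing beyond the standard library's U(0, 1) generator, with a = 2 and b = 5 chosen arbitrarily:

```python
import random

rng = random.Random(42)    # fixed seed for reproducibility
a, b = 2.0, 5.0
n = 200_000

# Y = a + (b - a) X with X ~ U(0, 1) should behave like U(a, b).
ys = [a + (b - a) * rng.random() for _ in range(n)]

sample_mean = sum(ys) / n                   # should be near (a + b)/2 = 3.5
below_4 = sum(y <= 4.0 for y in ys) / n     # should be near F_Y(4) = (4 - a)/(b - a) = 2/3
print(sample_mean, below_4)
```

The empirical mean and the empirical cdf value both settle on the theoretical answers as n grows.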

Mean and Variance of U(a, b)
To find the mean of X ~ U(a, b),

    E(X) = ∫ x f(x) dx = ∫_a^b x/(b - a) dx = [x²/{2(b - a)}]_a^b = (b² - a²)/{2(b - a)} = (b - a)(b + a)/{2(b - a)} = (a + b)/2.

Similarly we get Var(X) = E(X²) - E(X)² = (b - a)²/12, so

    µ = (a + b)/2,  σ² = (b - a)²/12.

3.2 Exponential

Exp(λ)
Suppose now X is a random variable taking values on R⁺ = [0, ∞) with pdf

    f(x) = λ e^{-λx},  x ≥ 0,

for some λ > 0. Then X is said to follow an exponential distribution with rate parameter λ and we write X ~ Exp(λ).

Straightforward integration between 0 and x leads to the cdf,

    F(x) = 1 - e^{-λx},  x ≥ 0.

The mean and variance are given by

    µ = 1/λ,  σ² = 1/λ².

[Figure: the pdfs of Exp(1), Exp(0.5) and Exp(0.2).]
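Because the exponential cdf is invertible, F⁻¹(u) = -log(1 - u)/λ, exponential variates can be drawn by inverse-transform sampling from U(0, 1). A simulation sketch (λ = 0.5 is an arbitrary choice) checking the stated mean and variance:

```python
import math
import random

rng = random.Random(7)
lam = 0.5
n = 200_000

# Inverse transform: if U ~ U(0,1) then -log(1 - U)/lam ~ Exp(lam).
xs = [-math.log(1 - rng.random()) / lam for _ in range(n)]

sample_mean = sum(xs) / n                                  # ~ 1/lam = 2
sample_var = sum((x - sample_mean) ** 2 for x in xs) / n   # ~ 1/lam^2 = 4
frac_below_1 = sum(x <= 1 for x in xs) / n                 # ~ F(1) = 1 - e^{-0.5}
print(sample_mean, sample_var, frac_below_1)
```

The same inversion idea works for any continuous distribution with a tractable inverse cdf, which is one reason quantile functions matter computationally.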

[Figure: the cdfs of Exp(0.2), Exp(0.5) and Exp(1).]

Lack of Memory Property
First notice that from the exponential distribution cdf equation we have

    P(X > x) = e^{-λx}.

An important (and not always desirable) characteristic of the exponential distribution is the so-called lack of memory property. For x, s > 0, consider the conditional probability P(X > x + s | X > s) of the additional magnitude of an exponentially distributed random variable given we already know it is greater than s. Well,

    P(X > x + s | X > s) = P(X > x + s)/P(X > s),

which, when X ~ Exp(λ), gives

    P(X > x + s | X > s) = e^{-λ(x+s)}/e^{-λs} = e^{-λx},

again an exponential distribution with parameter λ. So if we think of the exponential variable as the time to an event, then knowledge that we have waited time s for the event tells us nothing about how much longer we will have to wait; the process has no memory.

Examples
Exponential random variables are often used to model the time until occurrence of a random event where there is an assumed constant risk (λ) of the event happening over time, and so are frequently used as a simplest model, for example, in reliability analysis. So examples include:
- the time to failure of a component in a system;
- the time until we find the next mistake on my slides;
- the distance we travel along a road until we find the next pothole;
- the time until the next job arrives at a database server.
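The memoryless identity is a simple algebraic fact about the survival function, so it can be checked exactly for any x, s > 0; a tiny sketch (the values of λ, x and s are arbitrary):

```python
import math

lam = 0.5
S = lambda x: math.exp(-lam * x)    # survival function P(X > x) for Exp(lam)

x, s = 2.0, 3.0
conditional = S(x + s) / S(s)       # P(X > x + s | X > s)
unconditional = S(x)                # P(X > x)
print(conditional, unconditional)   # equal, for any choice of x, s > 0
```

Since e^{-λ(x+s)}/e^{-λs} collapses to e^{-λx}, the two numbers agree to floating-point precision whatever x and s are chosen.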

Link with Poisson Distribution
Notice the duality between some of the exponential rv examples and those we saw for a Poisson distribution. In each case, "number of events" has been replaced with "time between events".

Claim: If events in a random process occur according to a Poisson distribution with rate λ, then the time between events has an exponential distribution with rate parameter λ.

Proof: Suppose we have some random event process such that ∀x > 0, the number of events occurring in [0, x], N_x, follows a Poisson distribution with rate parameter λx, so N_x ~ Poi(λx). Such a process is known as a homogeneous Poisson process. Let X be the time until the first event of this process arrives. Then we notice that

    P(X > x) = P(N_x = 0) = (λx)⁰ e^{-λx}/0! = e^{-λx},

and hence X ~ Exp(λ). The same argument applies for all subsequent inter-arrival times.

3.3 Gaussian

N(µ, σ²)
Suppose X is a random variable taking values on R with pdf

    f(x) = 1/(σ√(2π)) exp{-(x - µ)²/(2σ²)},

for some µ ∈ R, σ > 0. Then X is said to follow a Gaussian or normal distribution with mean µ and variance σ², and we write X ~ N(µ, σ²).

The cdf of X ~ N(µ, σ²) is not analytically tractable for any (µ, σ), so we can only write

    F(x) = 1/(σ√(2π)) ∫_{-∞}^{x} exp{-(t - µ)²/(2σ²)} dt.
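The exponential/Poisson duality can be illustrated in the other direction: generate Exp(λ) inter-arrival times, count how many arrivals land in [0, 1], and the counts should behave like Poi(λ). A simulation sketch with λ = 4 chosen arbitrarily:

```python
import math
import random

rng = random.Random(1)
lam = 4.0
reps = 50_000

counts = []
for _ in range(reps):
    t, n_events = 0.0, 0
    while True:
        # Exp(lam) inter-arrival time via inverse-transform sampling
        t += -math.log(1 - rng.random()) / lam
        if t > 1.0:
            break
        n_events += 1
    counts.append(n_events)

mean_count = sum(counts) / reps                   # Poisson mean: ~lam
frac_zero = sum(c == 0 for c in counts) / reps    # ~P(N_1 = 0) = e^{-lam}
print(mean_count, frac_zero)
```

The empirical P(N₁ = 0) matching e^{-λ} is exactly the identity used in the proof above.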

[Figure: the pdfs of N(0,1), N(2,1) and N(0,4).]

[Figure: the cdfs of N(0,1), N(2,1) and N(0,4).]

N(0, 1)
Setting µ = 0, σ = 1 and Z ~ N(0, 1) gives the special case of the standard normal, with simplified density

    f(z) ≡ φ(z) = (1/√(2π)) e^{-z²/2}.

Again for the cdf, we can only write

    F(z) ≡ Φ(z) = (1/√(2π)) ∫_{-∞}^{z} e^{-t²/2} dt.

Statistical Tables
Since the cdf, and therefore any probabilities, associated with a normal distribution are not analytically available, numerical integration procedures are used to find approximate probabilities.

In particular, statistical tables contain values of the standard normal cdf Φ(z) for a range of values z ∈ R, and the quantiles Φ⁻¹(α) for a range of values α ∈ (0, 1). Linear interpolation is used for approximation between the tabulated values.

But why just tabulate N(0, 1)? We will now see how all normal distribution probabilities can be related back to probabilities from a standard normal distribution.

Linear Transformations of Normal Random Variables
Suppose we have X ~ N(µ, σ²). Then for any constants a, b ∈ R, the linear combination aX + b also follows a normal distribution. More precisely,

    X ~ N(µ, σ²) ⟹ aX + b ~ N(aµ + b, a²σ²),  ∀a, b ∈ R.

(Note that the mean and variance parameters of this transformed distribution follow from the general results for expectation and variance of any random variable under linear transformation.)

In particular, this allows us to standardise any normal rv:

    X ~ N(µ, σ²) ⟹ (X - µ)/σ ~ N(0, 1).

Standardising Normal Random Variables
So if X ~ N(µ, σ²) and we set Z = (X - µ)/σ, then since σ > 0 we can first observe that for any x ∈ R,

    X ≤ x ⟺ (X - µ)/σ ≤ (x - µ)/σ ⟺ Z ≤ (x - µ)/σ.

Therefore we can write the cdf of X in terms of Φ:

    F_X(x) = P(X ≤ x) = P(Z ≤ (x - µ)/σ) = Φ((x - µ)/σ).

[Table of Φ(z) for a range of z values; the tabulated values are not reproduced here.]
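In place of printed tables, Φ can be evaluated to high accuracy from the error function available in any standard maths library, via the identity Φ(z) = {1 + erf(z/√2)}/2. A sketch using Python's math.erf:

```python
import math

def Phi(z):
    """Standard normal cdf: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(Phi(0))      # 0.5, by symmetry
print(Phi(1.96))   # ~0.975, the familiar tabulated value
print(Phi(-1.2))   # ~0.115, equal to 1 - Phi(1.2)
```

This reproduces the tabulated values without interpolation, and the symmetry identity Φ(-z) = 1 - Φ(z) holds exactly.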

Using the Table of Φ
First of all notice that Φ(z) has been tabulated for z > 0. This is because the standard normal pdf φ is symmetric about 0, so φ(-z) = φ(z). For the cdf Φ, this means

    Φ(-z) = 1 - Φ(z).

So for example, Φ(-1.2) = 1 - Φ(1.2) ≈ 0.115.

Similarly, if Z ~ N(0, 1) and we want P(Z > z), then for example P(Z > 1.5) = 1 - P(Z ≤ 1.5) = 1 - Φ(1.5).

Important Quantiles of N(0, 1)
We will often have cause to use the 97.5% and 99.5% quantiles of N(0, 1), given by Φ⁻¹(0.975) and Φ⁻¹(0.995):
- Φ(1.96) ≈ 97.5%, so with 95% probability an N(0, 1) rv will lie in [-1.96, 1.96] (≈ [-2, 2]).
- Φ(2.58) ≈ 99.5%, so with 99% probability an N(0, 1) rv will lie in [-2.58, 2.58].

More generally, for α ∈ (0, 1) and defining z_{1-α/2} to be the (1 - α/2) quantile of N(0, 1), if Z ~ N(0, 1) then

    P_Z(Z ∈ [-z_{1-α/2}, z_{1-α/2}]) = 1 - α.

More generally still, if X ~ N(µ, σ²), then

    P_X(X ∈ [µ - σz_{1-α/2}, µ + σz_{1-α/2}]) = 1 - α,

and hence [µ - σz_{1-α/2}, µ + σz_{1-α/2}] gives a (1 - α) probability region for X centred around µ. This can be rewritten as

    P_X(|X - µ| ≤ σz_{1-α/2}) = 1 - α.

Example
An analogue signal received at a detector (measured in microvolts) may be modelled as a Gaussian random variable X ~ N(200, 256).
1. What is the probability that the signal will exceed 240 µV?
2. What is the probability that the signal is larger than 240 µV given that it is greater than 210 µV?

Solutions:
1. P(X > 240) = 1 - P(X ≤ 240) = 1 - Φ((240 - 200)/√256) = 1 - Φ(2.5) ≈ 0.0062.
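The signal example above is a direct application of standardisation, and both parts can be computed in a few lines (again building Φ from the error function):

```python
import math

Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal cdf

mu, sigma = 200.0, math.sqrt(256)   # X ~ N(200, 256), so sigma = 16

# Part 1: P(X > 240) = 1 - Phi((240 - mu)/sigma) = 1 - Phi(2.5)
p_exceed_240 = 1 - Phi((240 - mu) / sigma)

# Part 2: P(X > 240 | X > 210) = P(X > 240) / P(X > 210)
p_cond = p_exceed_240 / (1 - Phi((210 - mu) / sigma))

print(p_exceed_240, p_cond)
```

Part 1 gives roughly 0.006, and conditioning on X > 210 raises this to a little over 0.02.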

2. P(X > 240 | X > 210) = P(X > 240)/P(X > 210) = {1 - Φ(2.5)}/{1 - Φ((210 - 200)/16)} = {1 - Φ(2.5)}/{1 - Φ(0.625)} ≈ 0.023.

The Central Limit Theorem
Let X₁, X₂, ..., X_n be n independent and identically distributed (iid) random variables from any probability distribution, each with mean µ and variance σ². From before we know

    E(∑_{i=1}^n X_i) = nµ,  Var(∑_{i=1}^n X_i) = nσ².

First notice

    E(∑_{i=1}^n X_i - nµ) = 0,  Var(∑_{i=1}^n X_i - nµ) = nσ².

Dividing by √n σ,

    E({∑_{i=1}^n X_i - nµ}/(√n σ)) = 0,  Var({∑_{i=1}^n X_i - nµ}/(√n σ)) = 1.

But we can now present the following, astonishing result: the cdf of the standardised sum converges to the standard normal cdf,

    lim_{n→∞} P({∑_{i=1}^n X_i - nµ}/(√n σ) ≤ z) = Φ(z).

This can also be written as

    lim_{n→∞} P({X̄ - µ}/(σ/√n) ≤ z) = Φ(z),  where X̄ = (∑_{i=1}^n X_i)/n.

Or finally, for large n we have approximately

    X̄ ~ N(µ, σ²/n),  or  ∑_{i=1}^n X_i ~ N(nµ, nσ²).

We note here that although all these approximate distributional results hold irrespective of the distribution of the {X_i}, in the special case where X_i ~ N(µ, σ²) these distributional results are, in fact, exact. This is because the sum of independent normally distributed random variables is also normally distributed.

Example
Consider the most simple example: X₁, X₂, ... are iid Bernoulli(p) discrete random variables taking value 0 or 1. Then the {X_i} each have mean µ = p and variance σ² = p(1 - p). By definition, we know that for any n,

    ∑_{i=1}^n X_i ~ Binomial(n, p).
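The convergence of the standardised sum to N(0, 1) can be watched directly by simulation. A sketch using U(0, 1) summands (an arbitrary non-normal choice; µ = 1/2, σ² = 1/12), comparing the empirical cdf of the standardised sum with Φ at a couple of points:

```python
import math
import random

rng = random.Random(3)
n, reps = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)   # mean and sd of a single U(0, 1) variable

# Standardised sums (sum of X_i - n*mu) / (sqrt(n)*sigma) for U(0,1) summands.
zs = [(sum(rng.random() for _ in range(n)) - n * mu) / (math.sqrt(n) * sigma)
      for _ in range(reps)]

frac_below_196 = sum(z <= 1.96 for z in zs) / reps   # near Phi(1.96) ~ 0.975
frac_below_0 = sum(z <= 0 for z in zs) / reps        # near Phi(0) = 0.5
print(frac_below_196, frac_below_0)
```

Even at n = 30 the uniform-summand case is already very close to the normal limit; heavier-tailed summands need larger n.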

But now, by the Central Limit Theorem (CLT), we also have for large n that, approximately,

    ∑_{i=1}^n X_i ~ N(nµ, nσ²) = N(np, np(1 - p)).

So for large n,

    Binomial(n, p) ≈ N(np, np(1 - p)).

Notice that the LHS is a discrete distribution, and the RHS is a continuous distribution.

[Figure: the Binomial(10, 0.5) pmf overlaid with the N(5, 2.5) pdf.]

[Figure: the Binomial(100, 0.5) pmf overlaid with the N(50, 25) pdf.]

[Figure: the Binomial(1000, 0.5) pmf overlaid with the N(500, 250) pdf.]

So suppose X was the number of heads found on 1000 tosses of a fair coin, and we were interested in P(X ≤ 490). Using the binomial distribution pmf, we would need to calculate

    P(X ≤ 490) = p_X(0) + p_X(1) + p_X(2) + ... + p_X(490) (!) ≈ 0.27.

However, using the CLT we have approximately X ~ N(500, 250) and so

    P(X ≤ 490) ≈ Φ((490 - 500)/√250) = Φ(-0.632) = 1 - Φ(0.632) ≈ 0.26.

Log-Normal Distribution
Suppose X ~ N(µ, σ²), and consider the transformation Y = e^X. Then if g(x) = e^x, g⁻¹(y) = log(y) and g⁻¹'(y) = 1/y. Then by (2) we have

    f_Y(y) = 1/(σy√(2π)) exp[-{log(y) - µ}²/(2σ²)],  y > 0,

and we say Y follows a log-normal distribution.

[Figure: the pdfs of LN(0,1), LN(2,1) and LN(0,4).]
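The coin-toss comparison can be carried out exactly: the 491-term binomial sum is tedious by hand but trivial by machine if the pmf is evaluated in log-space (via log-gamma) to avoid overflow in the binomial coefficients. A sketch:

```python
import math

def binom_cdf(k, n, p):
    """Exact Binomial(n, p) cdf P(X <= k), summing the pmf in log-space."""
    total = 0.0
    for i in range(k + 1):
        log_pmf = (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
                   + i * math.log(p) + (n - i) * math.log(1 - p))
        total += math.exp(log_pmf)
    return total

Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal cdf

exact = binom_cdf(490, 1000, 0.5)              # the 491-term sum, ~0.27
approx = Phi((490 - 500) / math.sqrt(250))     # CLT approximation, ~0.26
print(exact, approx)
```

The two values differ by about 0.01 at n = 1000; applying a continuity correction (using 490.5 in place of 490) would close most of that remaining gap.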


More information

Chapter 7 Probability

Chapter 7 Probability 7-8: Probability and Statistics Mat Dr. Firoz Chapter 7 Probability Definition: Probability is a real valued set function P that assigns to each event A in the sample space S a number A), called the probability

More information

2 Binomial, Poisson, Normal Distribution

2 Binomial, Poisson, Normal Distribution 2 Binomial, Poisson, Normal Distribution Binomial Distribution ): We are interested in the number of times an event A occurs in n independent trials. In each trial the event A has the same probability

More information

Practice with Proofs

Practice with Proofs Practice with Proofs October 6, 2014 Recall the following Definition 0.1. A function f is increasing if for every x, y in the domain of f, x < y = f(x) < f(y) 1. Prove that h(x) = x 3 is increasing, using

More information

The Normal Distribution. Alan T. Arnholt Department of Mathematical Sciences Appalachian State University

The Normal Distribution. Alan T. Arnholt Department of Mathematical Sciences Appalachian State University The Normal Distribution Alan T. Arnholt Department of Mathematical Sciences Appalachian State University arnholt@math.appstate.edu Spring 2006 R Notes 1 Copyright c 2006 Alan T. Arnholt 2 Continuous Random

More information

PSTAT 120B Probability and Statistics

PSTAT 120B Probability and Statistics - Week University of California, Santa Barbara April 10, 013 Discussion section for 10B Information about TA: Fang-I CHU Office: South Hall 5431 T Office hour: TBA email: chu@pstat.ucsb.edu Slides will

More information

M2S1 Lecture Notes. G. A. Young http://www2.imperial.ac.uk/ ayoung

M2S1 Lecture Notes. G. A. Young http://www2.imperial.ac.uk/ ayoung M2S1 Lecture Notes G. A. Young http://www2.imperial.ac.uk/ ayoung September 2011 ii Contents 1 DEFINITIONS, TERMINOLOGY, NOTATION 1 1.1 EVENTS AND THE SAMPLE SPACE......................... 1 1.1.1 OPERATIONS

More information

Example: 1. You have observed that the number of hits to your web site follow a Poisson distribution at a rate of 2 per day.

Example: 1. You have observed that the number of hits to your web site follow a Poisson distribution at a rate of 2 per day. 16 The Exponential Distribution Example: 1. You have observed that the number of hits to your web site follow a Poisson distribution at a rate of 2 per day. Let T be the time (in days) between hits. 2.

More information

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS

6.041/6.431 Spring 2008 Quiz 2 Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS 6.4/6.43 Spring 28 Quiz 2 Wednesday, April 6, 7:3-9:3 PM. SOLUTIONS Name: Recitation Instructor: TA: 6.4/6.43: Question Part Score Out of 3 all 36 2 a 4 b 5 c 5 d 8 e 5 f 6 3 a 4 b 6 c 6 d 6 e 6 Total

More information

Chapter 5. Random variables

Chapter 5. Random variables Random variables random variable numerical variable whose value is the outcome of some probabilistic experiment; we use uppercase letters, like X, to denote such a variable and lowercase letters, like

More information

6.4 Logarithmic Equations and Inequalities

6.4 Logarithmic Equations and Inequalities 6.4 Logarithmic Equations and Inequalities 459 6.4 Logarithmic Equations and Inequalities In Section 6.3 we solved equations and inequalities involving exponential functions using one of two basic strategies.

More information

MATH 425, PRACTICE FINAL EXAM SOLUTIONS.

MATH 425, PRACTICE FINAL EXAM SOLUTIONS. MATH 45, PRACTICE FINAL EXAM SOLUTIONS. Exercise. a Is the operator L defined on smooth functions of x, y by L u := u xx + cosu linear? b Does the answer change if we replace the operator L by the operator

More information

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem

IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem IEOR 6711: Stochastic Models I Fall 2012, Professor Whitt, Tuesday, September 11 Normal Approximations and the Central Limit Theorem Time on my hands: Coin tosses. Problem Formulation: Suppose that I have

More information

The Exponential Distribution

The Exponential Distribution 21 The Exponential Distribution From Discrete-Time to Continuous-Time: In Chapter 6 of the text we will be considering Markov processes in continuous time. In a sense, we already have a very good understanding

More information

Chapter 4 - Lecture 1 Probability Density Functions and Cumul. Distribution Functions

Chapter 4 - Lecture 1 Probability Density Functions and Cumul. Distribution Functions Chapter 4 - Lecture 1 Probability Density Functions and Cumulative Distribution Functions October 21st, 2009 Review Probability distribution function Useful results Relationship between the pdf and the

More information

6 PROBABILITY GENERATING FUNCTIONS

6 PROBABILITY GENERATING FUNCTIONS 6 PROBABILITY GENERATING FUNCTIONS Certain derivations presented in this course have been somewhat heavy on algebra. For example, determining the expectation of the Binomial distribution (page 5.1 turned

More information

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1

ECE302 Spring 2006 HW5 Solutions February 21, 2006 1 ECE3 Spring 6 HW5 Solutions February 1, 6 1 Solutions to HW5 Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics

More information

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL

FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL FEGYVERNEKI SÁNDOR, PROBABILITY THEORY AND MATHEmATICAL STATIsTICs 4 IV. RANDOm VECTORs 1. JOINTLY DIsTRIBUTED RANDOm VARIABLEs If are two rom variables defined on the same sample space we define the joint

More information

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability

Discrete Mathematics and Probability Theory Fall 2009 Satish Rao, David Tse Note 18. A Brief Introduction to Continuous Probability CS 7 Discrete Mathematics and Probability Theory Fall 29 Satish Rao, David Tse Note 8 A Brief Introduction to Continuous Probability Up to now we have focused exclusively on discrete probability spaces

More information

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS

UNIT I: RANDOM VARIABLES PART- A -TWO MARKS UNIT I: RANDOM VARIABLES PART- A -TWO MARKS 1. Given the probability density function of a continuous random variable X as follows f(x) = 6x (1-x) 0

More information

Example. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away)

Example. A casino offers the following bets (the fairest bets in the casino!) 1 You get $0 (i.e., you can walk away) : Three bets Math 45 Introduction to Probability Lecture 5 Kenneth Harris aharri@umich.edu Department of Mathematics University of Michigan February, 009. A casino offers the following bets (the fairest

More information

An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random

More information

Exponential Distribution

Exponential Distribution Exponential Distribution Definition: Exponential distribution with parameter λ: { λe λx x 0 f(x) = 0 x < 0 The cdf: F(x) = x Mean E(X) = 1/λ. f(x)dx = Moment generating function: φ(t) = E[e tx ] = { 1

More information

APPLIED MATHEMATICS ADVANCED LEVEL

APPLIED MATHEMATICS ADVANCED LEVEL APPLIED MATHEMATICS ADVANCED LEVEL INTRODUCTION This syllabus serves to examine candidates knowledge and skills in introductory mathematical and statistical methods, and their applications. For applications

More information

MATH 10: Elementary Statistics and Probability Chapter 5: Continuous Random Variables

MATH 10: Elementary Statistics and Probability Chapter 5: Continuous Random Variables MATH 10: Elementary Statistics and Probability Chapter 5: Continuous Random Variables Tony Pourmohamad Department of Mathematics De Anza College Spring 2015 Objectives By the end of this set of slides,

More information

Joint Exam 1/P Sample Exam 1

Joint Exam 1/P Sample Exam 1 Joint Exam 1/P Sample Exam 1 Take this practice exam under strict exam conditions: Set a timer for 3 hours; Do not stop the timer for restroom breaks; Do not look at your notes. If you believe a question

More information

Absolute Value Equations and Inequalities

Absolute Value Equations and Inequalities . Absolute Value Equations and Inequalities. OBJECTIVES 1. Solve an absolute value equation in one variable. Solve an absolute value inequality in one variable NOTE Technically we mean the distance between

More information

Approximation of Aggregate Losses Using Simulation

Approximation of Aggregate Losses Using Simulation Journal of Mathematics and Statistics 6 (3): 233-239, 2010 ISSN 1549-3644 2010 Science Publications Approimation of Aggregate Losses Using Simulation Mohamed Amraja Mohamed, Ahmad Mahir Razali and Noriszura

More information

Math 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand. Practice Test, 1/28/2008 (with solutions)

Math 370, Actuarial Problemsolving Spring 2008 A.J. Hildebrand. Practice Test, 1/28/2008 (with solutions) Math 370, Actuarial Problemsolving Spring 008 A.J. Hildebrand Practice Test, 1/8/008 (with solutions) About this test. This is a practice test made up of a random collection of 0 problems from past Course

More information

Section 3-7. Marginal Analysis in Business and Economics. Marginal Cost, Revenue, and Profit. 202 Chapter 3 The Derivative

Section 3-7. Marginal Analysis in Business and Economics. Marginal Cost, Revenue, and Profit. 202 Chapter 3 The Derivative 202 Chapter 3 The Derivative Section 3-7 Marginal Analysis in Business and Economics Marginal Cost, Revenue, and Profit Application Marginal Average Cost, Revenue, and Profit Marginal Cost, Revenue, and

More information

Chapter 7 - Roots, Radicals, and Complex Numbers

Chapter 7 - Roots, Radicals, and Complex Numbers Math 233 - Spring 2009 Chapter 7 - Roots, Radicals, and Complex Numbers 7.1 Roots and Radicals 7.1.1 Notation and Terminology In the expression x the is called the radical sign. The expression under the

More information

THE CENTRAL LIMIT THEOREM TORONTO

THE CENTRAL LIMIT THEOREM TORONTO THE CENTRAL LIMIT THEOREM DANIEL RÜDT UNIVERSITY OF TORONTO MARCH, 2010 Contents 1 Introduction 1 2 Mathematical Background 3 3 The Central Limit Theorem 4 4 Examples 4 4.1 Roulette......................................

More information

Covariance and Correlation

Covariance and Correlation Covariance and Correlation ( c Robert J. Serfling Not for reproduction or distribution) We have seen how to summarize a data-based relative frequency distribution by measures of location and spread, such

More information

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1].

The sample space for a pair of die rolls is the set. The sample space for a random number between 0 and 1 is the interval [0, 1]. Probability Theory Probability Spaces and Events Consider a random experiment with several possible outcomes. For example, we might roll a pair of dice, flip a coin three times, or choose a random real

More information

3.4. The Binomial Probability Distribution. Copyright Cengage Learning. All rights reserved.

3.4. The Binomial Probability Distribution. Copyright Cengage Learning. All rights reserved. 3.4 The Binomial Probability Distribution Copyright Cengage Learning. All rights reserved. The Binomial Probability Distribution There are many experiments that conform either exactly or approximately

More information

Hypothesis Testing for Beginners

Hypothesis Testing for Beginners Hypothesis Testing for Beginners Michele Piffer LSE August, 2011 Michele Piffer (LSE) Hypothesis Testing for Beginners August, 2011 1 / 53 One year ago a friend asked me to put down some easy-to-read notes

More information

The Binomial Distribution

The Binomial Distribution The Binomial Distribution James H. Steiger November 10, 00 1 Topics for this Module 1. The Binomial Process. The Binomial Random Variable. The Binomial Distribution (a) Computing the Binomial pdf (b) Computing

More information

CHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is.

CHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is. Some Continuous Probability Distributions CHAPTER 6: Continuous Uniform Distribution: 6. Definition: The density function of the continuous random variable X on the interval [A, B] is B A A x B f(x; A,

More information

A Tutorial on Probability Theory

A Tutorial on Probability Theory Paola Sebastiani Department of Mathematics and Statistics University of Massachusetts at Amherst Corresponding Author: Paola Sebastiani. Department of Mathematics and Statistics, University of Massachusetts,

More information

Review of Random Variables

Review of Random Variables Chapter 1 Review of Random Variables Updated: January 16, 2015 This chapter reviews basic probability concepts that are necessary for the modeling and statistical analysis of financial data. 1.1 Random

More information

The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables

The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables 1 The Monte Carlo Framework Suppose we wish

More information

Section 6.1 Joint Distribution Functions

Section 6.1 Joint Distribution Functions Section 6.1 Joint Distribution Functions We often care about more than one random variable at a time. DEFINITION: For any two random variables X and Y the joint cumulative probability distribution function

More information

Lecture 2: Discrete Distributions, Normal Distributions. Chapter 1

Lecture 2: Discrete Distributions, Normal Distributions. Chapter 1 Lecture 2: Discrete Distributions, Normal Distributions Chapter 1 Reminders Course website: www. stat.purdue.edu/~xuanyaoh/stat350 Office Hour: Mon 3:30-4:30, Wed 4-5 Bring a calculator, and copy Tables

More information

correct-choice plot f(x) and draw an approximate tangent line at x = a and use geometry to estimate its slope comment The choices were:

correct-choice plot f(x) and draw an approximate tangent line at x = a and use geometry to estimate its slope comment The choices were: Topic 1 2.1 mode MultipleSelection text How can we approximate the slope of the tangent line to f(x) at a point x = a? This is a Multiple selection question, so you need to check all of the answers that

More information

Statistics 100A Homework 7 Solutions

Statistics 100A Homework 7 Solutions Chapter 6 Statistics A Homework 7 Solutions Ryan Rosario. A television store owner figures that 45 percent of the customers entering his store will purchase an ordinary television set, 5 percent will purchase

More information

Examples: Joint Densities and Joint Mass Functions Example 1: X and Y are jointly continuous with joint pdf

Examples: Joint Densities and Joint Mass Functions Example 1: X and Y are jointly continuous with joint pdf AMS 3 Joe Mitchell Eamples: Joint Densities and Joint Mass Functions Eample : X and Y are jointl continuous with joint pdf f(,) { c 2 + 3 if, 2, otherwise. (a). Find c. (b). Find P(X + Y ). (c). Find marginal

More information

Math 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions

Math 370/408, Spring 2008 Prof. A.J. Hildebrand. Actuarial Exam Practice Problem Set 2 Solutions Math 70/408, Spring 2008 Prof. A.J. Hildebrand Actuarial Exam Practice Problem Set 2 Solutions About this problem set: These are problems from Course /P actuarial exams that I have collected over the years,

More information

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover

More information