Continuous Random Variables
COMP 245 STATISTICS
Dr N A Heard

Contents
1 Continuous Random Variables
  1.1 Introduction
  1.2 Probability Density Functions
  1.3 Transformations
2 Mean, Variance and Quantiles
  2.1 Expectation
  2.2 Variance
  2.3 Quantiles
3 Continuous Distributions
  3.1 Uniform
  3.2 Exponential
  3.3 Gaussian
1 Continuous Random Variables

1.1 Introduction

Definition
Suppose again we have a random experiment with sample space S and probability measure P. Recall our definition of a random variable as a mapping X : S → R from the sample space S to the real numbers, inducing a probability measure

    P_X(B) = P{X⁻¹(B)},  B ⊆ R.

We define the random variable X to be (absolutely) continuous if there exists a function f_X : R → R such that

    P_X(B) = ∫_B f_X(x) dx,  B ⊆ R,    (1)

in which case f_X is referred to as the probability density function (pdf) of X.

Comments
A connected sequence of comments:

One consequence of this definition is that the probability of any singleton set B = {x}, x ∈ R, is zero for a continuous random variable: the integral of f_X over a single point vanishes, so

    P_X(X = x) = P_X({x}) = 0.

This in turn implies that any countable set B = {x₁, x₂, ...} ⊆ R will have zero probability measure for a continuous random variable, since

    P_X(X ∈ B) = P_X(X = x₁) + P_X(X = x₂) + ... = 0.

This automatically implies that the range of a continuous random variable will be uncountable.

This tells us that a random variable cannot be both discrete and continuous.

Examples
The following quantities would typically be modelled with continuous random variables. They are measurements of time, distance and other phenomena that can, at least in theory, be determined to an arbitrarily high degree of accuracy.

- The height or weight of a randomly chosen individual from a population.
- The duration of this lecture.
- The volume of fuel consumed by a bus on its route.
- The total distance driven by a taxi cab in a day.

Note that there are obvious ways to discretise all of the above examples.
1.2 Probability Density Functions

pdf as the derivative of the cdf
From (1), we notice the cumulative distribution function (cdf) for a continuous random variable X is therefore given by

    F_X(x) = ∫_{−∞}^{x} f_X(t) dt,  x ∈ R.

This expression leads us to a definition of the pdf of a continuous rv X for which we already have the cdf: by the Fundamental Theorem of Calculus, the pdf of X is given by

    f_X(x) = d/dx F_X(x), also written F′_X(x).

Properties of a pdf
Since the pdf is the derivative of the cdf, and because we know that a cdf is non-decreasing, the pdf will always be non-negative. So, in the same way as we did for cdfs and discrete pmfs, we have the following checklist to ensure f_X is a valid pdf:

1. f_X(x) ≥ 0, ∀x ∈ R;
2. ∫_{−∞}^{∞} f_X(x) dx = 1.

Interval Probabilities
Suppose we are interested in whether a continuous rv X lies in an interval (a, b]. From the definition of a continuous random variable, this is given by

    P_X(a < X ≤ b) = ∫_a^b f_X(x) dx.

That is, the area under the pdf between a and b.
[Figure: a pdf curve with the shaded area between a and b representing P(a < X < b).]

Further Comments:
- Besides still being a non-decreasing function satisfying F_X(−∞) = 0, F_X(∞) = 1, the cdf F_X of a continuous random variable X is also (absolutely) continuous.
- For a continuous rv, since P(X = x) = 0,

      F(x) = P(X ≤ x) = P(X < x).

- For small δx, f_X(x)δx is approximately the probability that X takes a value in the small interval [x, x + δx).
- Since the density (pdf) f_X(x) is not itself a probability, then unlike the pmf of a discrete rv we do not require f_X(x) ≤ 1.
- From (1) it is clear that the pdf of a continuous rv X completely characterises its distribution, so we often just specify f_X.

Example
Suppose we have a continuous random variable X with probability density function

    f(x) = { cx²,  0 < x < 3
           { 0,    otherwise

for some unknown constant c.

Questions
1. Determine c.
2. Find the cdf of X.
3. Calculate P(1 < X < 2).

Solutions
1. To find c: we must have

    1 = ∫_{−∞}^{∞} f(x) dx = ∫_0^3 cx² dx = [cx³/3]_0^3 = 9c
    ⇒ c = 1/9.
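As a quick numerical check (my own sketch, not part of the notes), we can integrate the example pdf f(x) = x²/9 over (0, 3) with a simple trapezoidal rule and confirm that c = 1/9 makes it a valid density:

```python
# Hypothetical check: verify that f(x) = x^2 / 9 integrates to 1 over (0, 3),
# i.e. that c = 1/9 normalises the example pdf.
def trapezoid(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] with n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

f = lambda x: x**2 / 9          # pdf with c = 1/9
area = trapezoid(f, 0.0, 3.0)   # should be very close to 1
```

The same loop with any other value of c would give an area different from 1, which is exactly why the normalisation condition pins c down.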
2. F(x) = ∫_{−∞}^{x} f(u) du, giving

    F(x) = { 0,      x ≤ 0
           { x³/27,  0 < x < 3
           { 1,      x ≥ 3.

3. P(1 < X < 2) = F(2) − F(1) = 8/27 − 1/27 = 7/27.

[Figure: the pdf f(x) and cdf F(x) of this example.]

1.3 Transformations

Transforming random variables
Suppose we have a continuous random variable X and wish to consider the transformed random variable Y = g(X) for some function g : R → R, such that g is continuous and strictly monotonic (so g⁻¹ exists).

Suppose g is monotonic increasing. Then for y ∈ R,

    Y ≤ y ⟺ X ≤ g⁻¹(y).

So

    F_Y(y) = P_Y(Y ≤ y) = P_X(X ≤ g⁻¹(y)) = F_X(g⁻¹(y)).

By the chain rule of differentiation,

    f_Y(y) = F′_Y(y) = f_X{g⁻¹(y)} g⁻¹′(y).

Note g⁻¹′(y) = d/dy g⁻¹(y) is positive, since we assumed g was increasing.

If instead g is monotonic decreasing, then Y ≤ y ⟺ X ≥ g⁻¹(y), and so by comparison with before we would have

    F_Y(y) = P_X(X ≥ g⁻¹(y)) = 1 − F_X(g⁻¹(y)),
    f_Y(y) = F′_Y(y) = −f_X{g⁻¹(y)} g⁻¹′(y),

with g⁻¹′(y) always negative. So overall, for Y = g(X) we have

    f_Y(y) = f_X{g⁻¹(y)} |g⁻¹′(y)|.    (2)
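Formula (2) can be exercised concretely. As a sketch (my own example, not from the notes), take the pdf f_X(x) = x²/9 on (0, 3) and the increasing map g(x) = x²; then g⁻¹(y) = √y and the formula yields a density for Y = X² on (0, 9), which should still integrate to 1:

```python
import math

# Change-of-variables sketch: f_Y(y) = f_X(g^{-1}(y)) |d/dy g^{-1}(y)|
# with f_X(x) = x^2/9 on (0, 3) and g(x) = x^2.
f_X = lambda x: x**2 / 9
g_inv = lambda y: math.sqrt(y)                  # g^{-1}(y) = sqrt(y)
g_inv_prime = lambda y: 1 / (2 * math.sqrt(y))  # derivative of g^{-1}

def f_Y(y):
    """Density of Y = X^2 obtained from equation (2)."""
    return f_X(g_inv(y)) * abs(g_inv_prime(y))

# f_Y simplifies to sqrt(y)/18 on (0, 9); it should integrate to 1 there.
n, a, b = 100_000, 1e-9, 9.0
h = (b - a) / n
area = h * sum(f_Y(a + (i + 0.5) * h) for i in range(n))  # midpoint rule
```

Here f_Y(y) = (y/9)·(1/(2√y)) = √y/18, so for instance f_Y(4) = 2/18, and the numerical integral confirms the transformed density is properly normalised.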
2 Mean, Variance and Quantiles

2.1 Expectation

E(X)
For a continuous random variable X we define the mean or expectation of X,

    μ_X or E_X(X) = ∫_{−∞}^{∞} x f_X(x) dx.

More generally, for a function of interest of the random variable g : R → R we have

    E_X{g(X)} = ∫_{−∞}^{∞} g(x) f_X(x) dx.

Linearity of Expectation
Clearly, for continuous random variables we again have linearity of expectation,

    E(aX + b) = aE(X) + b,  ∀a, b ∈ R,

and for two functions g, h : R → R we have additivity of expectation,

    E{g(X) + h(X)} = E{g(X)} + E{h(X)}.

2.2 Variance

Var(X)
The variance of a continuous random variable X is given by

    σ²_X or Var_X(X) = E{(X − μ_X)²} = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx,

and again it is easy to show that

    Var_X(X) = ∫_{−∞}^{∞} x² f_X(x) dx − μ²_X = E(X²) − {E(X)}².

For a linear transformation aX + b we again have

    Var(aX + b) = a² Var(X),  ∀a, b ∈ R.
2.3 Quantiles

Q_X(α)
Recall we defined the lower quartile, median and upper quartile of a sample of data as the points (¼, ½, ¾)-way through the ordered sample. Analogously, for α ∈ [0, 1] and a continuous random variable X we define the α-quantile of X, Q_X(α), as

    Q_X(α) = min{q ∈ R : F_X(q) = α}.

If F_X is invertible then Q_X(α) = F_X⁻¹(α).

In particular, the median of a random variable X is Q_X(0.5). That is, the median is a solution to the equation F_X(x) = 1/2.

Example (continued)
Again suppose we have a continuous random variable X with probability density function given by

    f(x) = { x²/9,  0 < x < 3
           { 0,     otherwise.

Questions
1. Calculate E(X).
2. Calculate Var(X).
3. Calculate the median of X.

Solutions
1. E(X) = ∫ x f(x) dx = ∫_0^3 x³/9 dx = [x⁴/36]_0^3 = 81/36 = 2.25.

2. E(X²) = ∫ x² f(x) dx = ∫_0^3 x⁴/9 dx = [x⁵/45]_0^3 = 243/45 = 5.4.
   So Var(X) = E(X²) − {E(X)}² = 5.4 − 2.25² = 0.3375.

3. From earlier, F(x) = x³/27 for 0 < x < 3. Setting F(x) = 1/2 for the median and solving, we get x³/27 = 1/2 ⇒ x = 3/∛2 ≈ 2.381.
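These three worked answers can be confirmed numerically. The sketch below (my own check, not part of the notes) integrates x·f(x) and x²·f(x) with a midpoint rule and verifies the closed-form median against F(x) = x³/27:

```python
# Numerical confirmation of the worked example for f(x) = x^2/9 on (0, 3):
# E(X) = 2.25, Var(X) = 0.3375, and the median solves x^3/27 = 1/2.
f = lambda x: x**2 / 9

def integrate(g, a, b, n=100_000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

mean = integrate(lambda x: x * f(x), 0.0, 3.0)            # E(X)
var = integrate(lambda x: x**2 * f(x), 0.0, 3.0) - mean**2  # E(X^2) - E(X)^2
median = 3 / 2 ** (1 / 3)   # closed form from F(x) = x^3/27 = 1/2
```

The median line simply inverts the cdf: F(median) = median³/27 = (27/2)/27 = 1/2.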
3 Continuous Distributions

3.1 Uniform

U(a, b)
Suppose X is a continuous random variable with probability density function

    f(x) = { 1/(b − a),  a < x < b
           { 0,          otherwise,

and hence corresponding cumulative distribution function

    F(x) = { 0,                x ≤ a
           { (x − a)/(b − a),  a < x < b
           { 1,                x ≥ b.

Then X is said to follow a uniform distribution on the interval (a, b) and we write X ~ U(a, b).

[Figure: the pdf f(x) and cdf F(x) of U(0, 1).]

Notice from the cdf that the quantiles of U(0, 1) are the special case where Q(α) = α.

Relationship between U(a, b) and U(0, 1)
Suppose X ~ U(0, 1), so F_X(x) = x for 0 ≤ x ≤ 1. For a < b ∈ R, if Y = a + (b − a)X then Y ~ U(a, b).

Proof: We first observe that for any y ∈ (a, b),

    Y ≤ y ⟺ a + (b − a)X ≤ y ⟺ X ≤ (y − a)/(b − a).

From this we find Y ~ U(a, b), since

    F_Y(y) = P(Y ≤ y) = P_X(X ≤ (y − a)/(b − a)) = F_X((y − a)/(b − a)) = (y − a)/(b − a).
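This relationship is exactly how uniform variates on an arbitrary interval are generated in practice. As a Monte Carlo sketch (my own example values a = 2, b = 5), we can draw U(0, 1) variates, map them through Y = a + (b − a)X, and check the empirical cdf at the midpoint against (y − a)/(b − a):

```python
import random

# Generate U(2, 5) variates from U(0, 1) via Y = a + (b - a) X and check
# the empirical cdf at y = 3.5, where (y - a)/(b - a) = 0.5.
random.seed(0)
a, b = 2.0, 5.0
ys = [a + (b - a) * random.random() for _ in range(100_000)]

frac_below_mid = sum(y <= 3.5 for y in ys) / len(ys)  # should be near 0.5
```

Every draw lands in [a, b) by construction, and the empirical cdf tracks the linear theoretical cdf.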
Mean and Variance of U(a, b)
To find the mean of X ~ U(a, b):

    E(X) = ∫ x f(x) dx = ∫_a^b x/(b − a) dx = [x²/(2(b − a))]_a^b
         = (b² − a²)/(2(b − a)) = (b − a)(b + a)/(2(b − a)) = (a + b)/2.

Similarly we get Var(X) = E(X²) − E(X)² = (b − a)²/12, so

    μ = (a + b)/2,  σ² = (b − a)²/12.

3.2 Exponential

Exp(λ)
Suppose now X is a random variable taking values on R⁺ = [0, ∞) with pdf

    f(x) = λe^{−λx},  x ≥ 0,

for some λ > 0. Then X is said to follow an exponential distribution with rate parameter λ and we write X ~ Exp(λ).

Straightforward integration between 0 and x leads to the cdf,

    F(x) = 1 − e^{−λx},  x ≥ 0.

The mean and variance are given by

    μ = 1/λ,  σ² = 1/λ².

[Figure: pdfs of Exp(1), Exp(0.5) and Exp(0.2).]
[Figure: cdfs of Exp(0.2), Exp(0.5) and Exp(1).]

Lack of Memory Property
First notice that from the exponential distribution cdf equation we have

    P(X > x) = e^{−λx}.

An important (and not always desirable) characteristic of the exponential distribution is the so-called lack of memory property. For x, s > 0, consider the conditional probability P(X > x + s | X > s) of the additional magnitude of an exponentially distributed random variable, given we already know it is greater than s. Well,

    P(X > x + s | X > s) = P(X > x + s) / P(X > s),

which, when X ~ Exp(λ), gives

    P(X > x + s | X > s) = e^{−λ(x+s)} / e^{−λs} = e^{−λx},

again an exponential tail probability with parameter λ. So if we think of the exponential variable as the time to an event, then knowledge that we have waited time s for the event tells us nothing about how much longer we will have to wait: the process has no memory.

Examples
Exponential random variables are often used to model the time until occurrence of a random event where there is an assumed constant risk (λ) of the event happening over time, and so are frequently used as a simplest model, for example, in reliability analysis. Examples include:
- the time to failure of a component in a system;
- the time until we find the next mistake on my slides;
- the distance we travel along a road until we find the next pothole;
- the time until the next job arrives at a database server.
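The memoryless identity is easy to see in simulation. The sketch below (my own example, with λ = 0.5, s = 1, x = 2) estimates P(X > s + x | X > s) from exponential draws and compares it with the unconditional tail e^{−λx}:

```python
import math
import random

# Simulate the lack-of-memory property of Exp(lambda):
# P(X > s + x | X > s) should equal P(X > x) = exp(-lambda * x).
random.seed(1)
lam, s, x = 0.5, 1.0, 2.0

draws = [random.expovariate(lam) for _ in range(200_000)]
beyond_s = [d for d in draws if d > s]                    # condition on X > s
cond = sum(d > s + x for d in beyond_s) / len(beyond_s)   # conditional tail
exact = math.exp(-lam * x)                                # memoryless prediction
```

The conditional estimate matches e^{−λx} up to Monte Carlo noise, regardless of the choice of s; repeating the experiment with, say, a U(0, 1) variable instead would show the property failing.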
Link with Poisson Distribution
Notice the duality between some of the exponential rv examples and those we saw for a Poisson distribution. In each case, "number of events" has been replaced with "time between events".

Claim: If events in a random process occur according to a Poisson distribution with rate λ, then the time between events has an exponential distribution with rate parameter λ.

Proof: Suppose we have some random event process such that ∀x > 0, the number of events occurring in [0, x], N_x, follows a Poisson distribution with rate parameter λ, so N_x ~ Poi(λx). Such a process is known as a homogeneous Poisson process. Let X be the time until the first event of this process arrives. Then we notice that

    P(X > x) = P(N_x = 0) = (λx)⁰ e^{−λx} / 0! = e^{−λx},

and hence X ~ Exp(λ). The same argument applies for all subsequent inter-arrival times.

3.3 Gaussian

N(μ, σ²)
Suppose X is a random variable taking values on R with pdf

    f(x) = 1/(σ√(2π)) exp{−(x − μ)²/(2σ²)},

for some μ ∈ R, σ > 0. Then X is said to follow a Gaussian or normal distribution with mean μ and variance σ², and we write X ~ N(μ, σ²).

The cdf of X ~ N(μ, σ²) is not analytically tractable for any (μ, σ), so we can only write

    F(x) = 1/(σ√(2π)) ∫_{−∞}^{x} exp{−(t − μ)²/(2σ²)} dt.
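The duality runs in both directions: summing Exp(λ) inter-arrival times until the total exceeds t yields the event count N_t of a homogeneous Poisson process. As a simulation sketch (my own example, λ = 3 and t = 2, so E(N_t) = λt = 6):

```python
import random

# Build a Poisson process from Exp(lambda) gaps: the number of arrivals in
# [0, t] should have mean lambda * t.
random.seed(2)
lam, t, trials = 3.0, 2.0, 20_000

def count_events(lam, t):
    """Count arrivals in [0, t] when inter-arrival times are Exp(lam)."""
    total, n = 0.0, 0
    while True:
        total += random.expovariate(lam)
        if total > t:
            return n
        n += 1

counts = [count_events(lam, t) for _ in range(trials)]
mean_count = sum(counts) / trials   # should be close to lam * t = 6
```

This is the converse view of the claim above: exponential gaps in, Poisson counts out.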
[Figure: pdfs of N(0,1), N(2,1) and N(0,4).]

[Figure: cdfs of N(0,1), N(2,1) and N(0,4).]

N(0, 1)
Setting μ = 0, σ = 1 and Z ~ N(0, 1) gives the special case of the standard normal, with simplified density

    f(z) ≡ φ(z) = (1/√(2π)) e^{−z²/2}.

Again for the cdf, we can only write

    F(z) ≡ Φ(z) = (1/√(2π)) ∫_{−∞}^{z} e^{−t²/2} dt.

Statistical Tables
Since the cdf, and therefore any probabilities, associated with a normal distribution are not analytically available, numerical integration procedures are used to find approximate probabilities.
In particular, statistical tables contain values of the standard normal cdf Φ(z) for a range of values z ∈ R, and the quantiles Φ⁻¹(α) for a range of values α ∈ (0, 1). Linear interpolation is used for approximation between the tabulated values.

But why just tabulate N(0, 1)? We will now see how all normal distribution probabilities can be related back to probabilities from a standard normal distribution.

Linear Transformations of Normal Random Variables
Suppose we have X ~ N(μ, σ²). Then for any constants a, b ∈ R, the linear combination aX + b also follows a normal distribution. More precisely,

    X ~ N(μ, σ²) ⟹ aX + b ~ N(aμ + b, a²σ²),  ∀a, b ∈ R.

(Note that the mean and variance parameters of this transformed distribution follow from the general results for expectation and variance of any random variable under linear transformation.)

In particular, this allows us to standardise any normal rv:

    X ~ N(μ, σ²) ⟹ (X − μ)/σ ~ N(0, 1).

Standardising Normal Random Variables
So if X ~ N(μ, σ²) and we set Z = (X − μ)/σ, then since σ > 0 we can first observe that for any x ∈ R,

    X ≤ x ⟺ (X − μ)/σ ≤ (x − μ)/σ ⟺ Z ≤ (x − μ)/σ.

Therefore we can write the cdf of X in terms of Φ:

    F_X(x) = P(X ≤ x) = P(Z ≤ (x − μ)/σ) = Φ((x − μ)/σ).

[Table of Φ(z) for a range of z values; the tabulated values are unrecoverable from the transcription.]
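In code one rarely uses tables: Φ can be evaluated through the error function, which the Python standard library provides. The sketch below (mine, not part of the notes) defines Φ via math.erf and uses standardisation to get the cdf of any N(μ, σ²):

```python
import math

# Standard normal cdf via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2.
def Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    """cdf of N(mu, sigma^2), obtained by standardising: Phi((x - mu)/sigma)."""
    return Phi((x - mu) / sigma)
```

For example, Phi(1.96) returns approximately 0.975, matching the tabulated value, and normal_cdf(x, mu, sigma) reproduces the relation F_X(x) = Φ((x − μ)/σ) exactly.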
Using the Table of Φ
First of all, notice that Φ(z) has been tabulated for z > 0. This is because the standard normal pdf φ is symmetric about 0, so φ(−z) = φ(z). For the cdf Φ, this means

    Φ(−z) = 1 − Φ(z).

So, for example, Φ(−1.2) = 1 − Φ(1.2) ≈ 1 − 0.885 = 0.115.

Similarly, if Z ~ N(0, 1) and we want P(Z > z), then for example

    P(Z > 1.5) = 1 − P(Z ≤ 1.5) = 1 − Φ(1.5).

Important Quantiles of N(0, 1)
We will often have cause to use the 97.5% and 99.5% quantiles of N(0, 1), given by Φ⁻¹(0.975) and Φ⁻¹(0.995).

    Φ(1.96) ≈ 97.5%, so with 95% probability an N(0, 1) rv will lie in [−1.96, 1.96] (≈ [−2, 2]).
    Φ(2.58) ≈ 99.5%, so with 99% probability an N(0, 1) rv will lie in [−2.58, 2.58].

More generally, for α ∈ (0, 1) and defining z_{1−α/2} to be the (1 − α/2) quantile of N(0, 1), if Z ~ N(0, 1) then

    P_Z(Z ∈ [−z_{1−α/2}, z_{1−α/2}]) = 1 − α.

More generally still, if X ~ N(μ, σ²), then

    P_X(X ∈ [μ − σz_{1−α/2}, μ + σz_{1−α/2}]) = 1 − α,

and hence [μ − σz_{1−α/2}, μ + σz_{1−α/2}] gives a (1 − α) probability region for X centred around μ. This can be rewritten as

    P_X(|X − μ| ≤ σz_{1−α/2}) = 1 − α.

Example
An analogue signal received at a detector (measured in microvolts) may be modelled as a Gaussian random variable X ~ N(200, 256).
1. What is the probability that the signal will exceed 240 μV?
2. What is the probability that the signal is larger than 240 μV, given that it is greater than 210 μV?

Solutions:
1. P(X > 240) = 1 − P(X ≤ 240) = 1 − Φ((240 − 200)/16) = 1 − Φ(2.5) ≈ 0.006.
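Both parts of the signal example can be checked numerically (a sketch of mine, standing in for the statistical tables). With σ = √256 = 16, the two answers come straight from Φ:

```python
import math

# Signal example, X ~ N(200, 256), sigma = 16:
#   part 1: P(X > 240) = 1 - Phi(2.5)
#   part 2: P(X > 240 | X > 210) = P(X > 240) / P(X > 210)
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p1 = 1.0 - Phi((240 - 200) / 16)           # = 1 - Phi(2.5)
p2 = p1 / (1.0 - Phi((210 - 200) / 16))    # = p1 / (1 - Phi(0.625))
```

Part 1 comes out near 0.006, and conditioning on X > 210 raises the answer to roughly 0.023: knowing the signal already exceeds 210 μV makes exceeding 240 μV several times more likely.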
2. P(X > 240 | X > 210) = P(X > 240) / P(X > 210)
                        = (1 − Φ(2.5)) / (1 − Φ((210 − 200)/16))
                        = (1 − Φ(2.5)) / (1 − Φ(0.625)) ≈ 0.023.

The Central Limit Theorem
Let X₁, X₂, ..., X_n be n independent and identically distributed (iid) random variables from any probability distribution, each with mean μ and variance σ². From before we know

    E(Σ_{i=1}^n X_i) = nμ,  Var(Σ_{i=1}^n X_i) = nσ².

First notice that

    E(Σ_{i=1}^n X_i − nμ) = 0,  Var(Σ_{i=1}^n X_i − nμ) = nσ².

Dividing by √n σ,

    E((Σ_{i=1}^n X_i − nμ)/(√n σ)) = 0,  Var((Σ_{i=1}^n X_i − nμ)/(√n σ)) = 1.

But we can now present the following, astonishing result: the cdf of this standardised sum converges to the standard normal cdf,

    lim_{n→∞} P((Σ_{i=1}^n X_i − nμ)/(√n σ) ≤ z) = Φ(z),  ∀z ∈ R.

This can also be written in terms of the sample mean X̄ = (Σ_{i=1}^n X_i)/n as

    lim_{n→∞} P((X̄ − μ)/(σ/√n) ≤ z) = Φ(z).

Or finally, for large n we have approximately

    X̄ ~ N(μ, σ²/n),  or  Σ_{i=1}^n X_i ~ N(nμ, nσ²).

We note here that although all these approximate distributional results hold irrespective of the distribution of the {X_i}, in the special case where X_i ~ N(μ, σ²) these distributional results are, in fact, exact. This is because the sum of independent normally distributed random variables is also normally distributed.

Example
Consider the simplest example: X₁, X₂, ... are iid Bernoulli(p) discrete random variables taking value 0 or 1. Then the {X_i} each have mean μ = p and variance σ² = p(1 − p). By definition, we know that for any n,

    Σ_{i=1}^n X_i ~ Binomial(n, p).
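The CLT is well suited to simulation. As a Monte Carlo sketch (my own example), standardised sums of n = 50 iid U(0, 1) variates (μ = 1/2, σ² = 1/12) should already look close to N(0, 1); we check one point of the empirical cdf against Φ(1) ≈ 0.841:

```python
import math
import random

# CLT demonstration: standardised sums of iid U(0, 1) draws are
# approximately N(0, 1) for moderate n.
random.seed(3)
n, trials = 50, 40_000
mu, sigma = 0.5, math.sqrt(1 / 12)   # mean and sd of U(0, 1)

def standardised_sum():
    """(sum of n U(0,1) draws - n*mu) / (sqrt(n)*sigma)."""
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (math.sqrt(n) * sigma)

zs = [standardised_sum() for _ in range(trials)]
frac_below_1 = sum(z <= 1.0 for z in zs) / trials   # should approach Phi(1)
```

Nothing about the uniform distribution is normal, yet the standardised sums already track the standard normal cdf to within Monte Carlo noise.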
But now, by the Central Limit Theorem (CLT), we also have for large n that, approximately,

    Σ_{i=1}^n X_i ~ N(nμ, nσ²) = N(np, np(1 − p)).

So for large n,

    Binomial(n, p) ≈ N(np, np(1 − p)).

Notice that the LHS is a discrete distribution, and the RHS is a continuous distribution.

[Figure: Binomial(10, ½) pmf overlaid with the N(5, 2.5) pdf.]

[Figure: Binomial(100, ½) pmf overlaid with the N(50, 25) pdf.]
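The quality of this approximation can be measured directly. The sketch below (mine, not from the notes) computes the exact Binomial(1000, ½) cdf at 490 via log-pmf terms (using lgamma for the binomial coefficient) and compares it with the N(500, 250) approximation:

```python
import math

# Exact Binomial(1000, 0.5) tail vs. the normal approximation N(500, 250).
n, p, k = 1000, 0.5, 490

def binom_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p), summing pmf terms in log space."""
    total = 0.0
    for i in range(k + 1):
        logpmf = (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
                  + i * math.log(p) + (n - i) * math.log(1 - p))
        total += math.exp(logpmf)
    return total

Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

exact = binom_cdf(k, n, p)                               # exact P(X <= 490)
approx = Phi((k - n * p) / math.sqrt(n * p * (1 - p)))   # Phi(-0.632...)
```

The two values agree to about one percentage point at n = 1000; the small remaining gap is largely the continuity correction the notes omit (evaluating Φ at k + ½ instead of k would close most of it).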
[Figure: Binomial(1000, ½) pmf overlaid with the N(500, 250) pdf.]

So suppose X was the number of heads found on 1000 tosses of a fair coin, and we were interested in P(X ≤ 490). Using the binomial distribution pmf, we would need to calculate

    P(X ≤ 490) = p_X(0) + p_X(1) + p_X(2) + ... + p_X(490)  (≈ 0.27).

However, using the CLT we have approximately X ~ N(500, 250), and so

    P(X ≤ 490) ≈ Φ((490 − 500)/√250) = Φ(−0.632) = 1 − Φ(0.632) ≈ 0.264.

Log-Normal Distribution
Suppose X ~ N(μ, σ²), and consider the transformation Y = e^X. Then with g(x) = e^x we have g⁻¹(y) = log(y) and g⁻¹′(y) = 1/y, so by (2) we have

    f_Y(y) = 1/(σy√(2π)) exp{−(log(y) − μ)²/(2σ²)},  y > 0,

and we say Y follows a log-normal distribution.

[Figure: pdfs of LN(0,1), LN(2,1) and LN(0,4).]
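As a final sanity check on the change-of-variables result (my own sketch), the log-normal density obtained from (2) should integrate to 1; for LN(0, 1) the mass beyond y = 50 is negligible, so a midpoint rule over (0, 50] suffices:

```python
import math

# Check that the LN(0, 1) pdf derived via the transformation Y = e^X
# integrates to (essentially) 1.
mu, sigma = 0.0, 1.0

def lognormal_pdf(y):
    """Log-normal density from equation (2), valid for y > 0."""
    return (1.0 / (sigma * y * math.sqrt(2 * math.pi))
            * math.exp(-(math.log(y) - mu) ** 2 / (2 * sigma ** 2)))

# Midpoint rule on (0, 50]; the truncated tail contributes < 5e-5 for LN(0, 1).
n, a, b = 100_000, 1e-8, 50.0
h = (b - a) / n
area = h * sum(lognormal_pdf(a + (i + 0.5) * h) for i in range(n))
```

The small shortfall from 1 is exactly the truncated upper tail P(Y > 50) = 1 − Φ(log 50).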
More informationRenewal Theory. (iv) For s < t, N(t) N(s) equals the number of events in (s, t].
Renewal Theory Def. A stochastic process {N(t), t 0} is said to be a counting process if N(t) represents the total number of events that have occurred up to time t. X 1, X 2,... times between the events
More informatione.g. arrival of a customer to a service station or breakdown of a component in some system.
Poisson process Events occur at random instants of time at an average rate of λ events per second. e.g. arrival of a customer to a service station or breakdown of a component in some system. Let N(t) be
More informationDefinition: Suppose that two random variables, either continuous or discrete, X and Y have joint density
HW MATH 461/561 Lecture Notes 15 1 Definition: Suppose that two random variables, either continuous or discrete, X and Y have joint density and marginal densities f(x, y), (x, y) Λ X,Y f X (x), x Λ X,
More informationJoint Probability Distributions and Random Samples. Week 5, 2011 Stat 4570/5570 Material from Devore s book (Ed 8), and Cengage
5 Joint Probability Distributions and Random Samples Week 5, 2011 Stat 4570/5570 Material from Devore s book (Ed 8), and Cengage Two Discrete Random Variables The probability mass function (pmf) of a single
More informationLecture 10: Other Continuous Distributions and Probability Plots
Lecture 10: Other Continuous Distributions and Probability Plots Devore: Section 4.44.6 Page 1 Gamma Distribution Gamma function is a natural extension of the factorial For any α > 0, Γ(α) = 0 x α 1 e
More information9740/02 October/November MATHEMATICS (H2) Paper 2 Suggested Solutions. Ensure calculator s mode is in Radians. (iii) Refer to part (i)
GCE Level October/November 8 Suggested Solutions Mathematics H (974/) version. MTHEMTICS (H) Paper Suggested Solutions. Topic : Functions and Graphs (i) 974/ October/November 8 (iii) (iv) f(x) g(x)
More informationTransformations and Expectations of random variables
Transformations and Epectations of random variables X F X (): a random variable X distributed with CDF F X. Any function Y = g(x) is also a random variable. If both X, and Y are continuous random variables,
More informationThe Delta Method and Applications
Chapter 5 The Delta Method and Applications 5.1 Linear approximations of functions In the simplest form of the central limit theorem, Theorem 4.18, we consider a sequence X 1, X,... of independent and
More information0 x = 0.30 x = 1.10 x = 3.05 x = 4.15 x = 6 0.4 x = 12. f(x) =
. A mailorder computer business has si telephone lines. Let X denote the number of lines in use at a specified time. Suppose the pmf of X is as given in the accompanying table. 0 2 3 4 5 6 p(.0.5.20.25.20.06.04
More informationPractice problems for Homework 11  Point Estimation
Practice problems for Homework 11  Point Estimation 1. (10 marks) Suppose we want to select a random sample of size 5 from the current CS 3341 students. Which of the following strategies is the best:
More informationSolution Using the geometric series a/(1 r) = x=1. x=1. Problem For each of the following distributions, compute
Math 472 Homework Assignment 1 Problem 1.9.2. Let p(x) 1/2 x, x 1, 2, 3,..., zero elsewhere, be the pmf of the random variable X. Find the mgf, the mean, and the variance of X. Solution 1.9.2. Using the
More informationthe number of organisms in the squares of a haemocytometer? the number of goals scored by a football team in a match?
Poisson Random Variables (Rees: 6.8 6.14) Examples: What is the distribution of: the number of organisms in the squares of a haemocytometer? the number of hits on a web site in one hour? the number of
More informationIntroduction to Probability
Motoya Machida January 7, 6 This material is designed to provide a foundation in the mathematics of probability. We begin with the basic concepts of probability, such as events, random variables, independence,
More informationProbability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X
Week 6 notes : Continuous random variables and their probability densities WEEK 6 page 1 uniform, normal, gamma, exponential,chisquared distributions, normal approx'n to the binomial Uniform [,1] random
More informationMATH 56A SPRING 2008 STOCHASTIC PROCESSES 31
MATH 56A SPRING 2008 STOCHASTIC PROCESSES 3.3. Invariant probability distribution. Definition.4. A probability distribution is a function π : S [0, ] from the set of states S to the closed unit interval
More informationContinuous Distributions, Mainly the Normal Distribution
Continuous Distributions, Mainly the Normal Distribution 1 Continuous Random Variables STA 281 Fall 2011 Discrete distributions place probability on specific numbers. A Bin(n,p) distribution, for example,
More information1 Definition of Conditional Expectation
1 DEFINITION OF CONDITIONAL EXPECTATION 4 1 Definition of Conditional Expectation 1.1 General definition Recall the definition of conditional probability associated with Bayes Rule P(A B) P(A B) P(B) For
More informationDiscrete and Continuous Random Variables. Summer 2003
Discrete and Continuous Random Variables Summer 003 Random Variables A random variable is a rule that assigns a numerical value to each possible outcome of a probabilistic experiment. We denote a random
More informationSTAT/MTHE 353: Probability II. STAT/MTHE 353: Multiple Random Variables. Review. Administrative details. Instructor: TamasLinder
STAT/MTHE 353: Probability II STAT/MTHE 353: Multiple Random Variables Administrative details Instructor: TamasLinder Email: linder@mast.queensu.ca T. Linder ueen s University Winter 2012 O ce: Je ery
More informationProbabilities and Random Variables
Probabilities and Random Variables This is an elementary overview of the basic concepts of probability theory. 1 The Probability Space The purpose of probability theory is to model random experiments so
More informationChapter 6 MORE ABOUT RANDOM VARIABLES. Mean and variance of a random variable...
This work is licensed under the Creative commons AttributionNoncommercialShare Alike 2.5 South Africa License. You are free to copy, communicate and adapt the work on condition that you attribute the
More informationSet theory, and set operations
1 Set theory, and set operations Sayan Mukherjee Motivation It goes without saying that a Bayesian statistician should know probability theory in depth to model. Measure theory is not needed unless we
More information3.6: General Hypothesis Tests
3.6: General Hypothesis Tests The χ 2 goodness of fit tests which we introduced in the previous section were an example of a hypothesis test. In this section we now consider hypothesis tests more generally.
More informationProbability and Statistics
CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b  0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute  Systems and Modeling GIGA  Bioinformatics ULg kristel.vansteen@ulg.ac.be
More informationFor a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )
Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (19031987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll
More informationChange of Continuous Random Variable
Change of Continuous Random Variable All you are responsible for from this lecture is how to implement the Engineer s Way (see page 4) to compute how the probability density function changes when we make
More informationMATH 201. Final ANSWERS August 12, 2016
MATH 01 Final ANSWERS August 1, 016 Part A 1. 17 points) A bag contains three different types of dice: four 6sided dice, five 8sided dice, and six 0sided dice. A die is drawn from the bag and then rolled.
More informationLecture 16: Expected value, variance, independence and Chebyshev inequality
Lecture 16: Expected value, variance, independence and Chebyshev inequality Expected value, variance, and Chebyshev inequality. If X is a random variable recall that the expected value of X, E[X] is the
More informationNotes on the second moment method, Erdős multiplication tables
Notes on the second moment method, Erdős multiplication tables January 25, 20 Erdős multiplication table theorem Suppose we form the N N multiplication table, containing all the N 2 products ab, where
More informationMath 141. Lecture 7: Variance, Covariance, and Sums. Albyn Jones 1. 1 Library 304. jones/courses/141
Math 141 Lecture 7: Variance, Covariance, and Sums Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 Last Time Variance: expected squared deviation from the mean: Standard
More informationProbability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur
Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special DistributionsVI Today, I am going to introduce
More informationLecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. LogNormal Distribution
Lecture 8: Random Walk vs. Brownian Motion, Binomial Model vs. Logormal Distribution October 4, 200 Limiting Distribution of the Scaled Random Walk Recall that we defined a scaled simple random walk last
More informationL10: Probability, statistics, and estimation theory
L10: Probability, statistics, and estimation theory Review of probability theory Bayes theorem Statistics and the Normal distribution Least Squares Error estimation Maximum Likelihood estimation Bayesian
More informationCHAPTER 3. Statistics
CHAPTER 3 Statistics Statistics has many aspects. These include: data gathering data analysis statistical inference. For instance, politicians may wish to see whether a new law will be popular. They as
More information