# Common probability distributions. Math 217/218 Probability and Statistics, Prof. D. Joyce, 2016


## Introduction

I summarize here some of the more common distributions used in probability and statistics. Some are more important than others, and not all of them are used in all fields. I've identified four sources of these distributions, although there are more than these:

- the uniform distributions, either discrete Uniform$(n)$ or continuous Uniform$(a, b)$;
- those that come from the Bernoulli process or sampling with or without replacement: Bernoulli$(p)$, Binomial$(n, p)$, Geometric$(p)$, NegativeBinomial$(p, r)$, and Hypergeometric$(N, M, n)$;
- those that come from the Poisson process: Poisson$(\lambda, t)$, Exponential$(\lambda)$, Gamma$(\lambda, r)$, and Beta$(\alpha, \beta)$;
- those related to the Central Limit Theorem: Normal$(\mu, \sigma^2)$, Chi-squared$(\nu)$, T$(\nu)$, and F$(\nu_1, \nu_2)$.

For each distribution, I give the name of the distribution along with one or two parameters and indicate whether it is a discrete distribution or a continuous one. Then I describe an example interpretation for a random variable $X$ having that distribution. Each discrete distribution is determined by a probability mass function $f$ which gives the probabilities for the various outcomes, so that $f(x) = P(X{=}x)$, the probability that a random variable $X$ with that distribution takes on the value $x$. Each continuous distribution is determined by a probability density function $f$, which, when integrated from $a$ to $b$, gives the probability $P(a \le X \le b)$. Next, I list the mean $\mu = E(X)$ and variance $\sigma^2 = E((X-\mu)^2) = E(X^2) - \mu^2$ for the distribution, and for most of the distributions I include the moment generating function $m(t) = E(e^{tX})$. Finally, I indicate how some of the distributions may be used.

## Background on gamma and beta functions

Some of the functions below are described in terms of the gamma and beta functions. These are, in some sense, continuous versions of the factorial function $n!$ and the binomial coefficients $\binom{n}{k}$, as described below.

The gamma function, $\Gamma(x)$, is defined for any real number $x$, except for $0$ and the negative integers, by the integral

$$\Gamma(x) = \int_0^\infty t^{x-1} e^{-t}\,dt.$$

The $\Gamma$ function generalizes the factorial function in the sense that when $x$ is a positive integer, $\Gamma(x) = (x-1)!$. This follows from the recursion formula, $\Gamma(x+1) = x\,\Gamma(x)$, and the fact that $\Gamma(1) = 1$, both of which can be easily proved by methods of calculus.

The values of the $\Gamma$ function at half integers are important for some of the distributions mentioned below. A bit of clever work shows that $\Gamma(\tfrac12) = \sqrt{\pi}$, so by the recursion formula,

$$\Gamma\!\left(n + \tfrac12\right) = \frac{(2n-1)(2n-3)\cdots 3 \cdot 1}{2^n}\,\sqrt{\pi}.$$

The beta function, $B(\alpha, \beta)$, is a function of two positive real arguments. Like the $\Gamma$ function, it is defined by an integral, but it can also be defined in terms of the $\Gamma$ function:

$$B(\alpha, \beta) = \int_0^1 t^{\alpha-1} (1-t)^{\beta-1}\,dt = \frac{\Gamma(\alpha)\,\Gamma(\beta)}{\Gamma(\alpha+\beta)}.$$

The beta function is related to binomial coefficients, for when $\alpha$ and $\beta$ are positive integers,

$$B(\alpha, \beta) = \frac{(\alpha-1)!\,(\beta-1)!}{(\alpha+\beta-1)!}.$$

## The uniform distributions

Uniform distributions come in two kinds, discrete and continuous. They share the property that all possible values are equally likely.

### 2.1 Discrete uniform distribution. Uniform$(n)$. Discrete.

In general, a discrete uniform random variable $X$ can take any finite set as values, but here I only consider the case when $X$ takes on integer values from $1$ to $n$, where the parameter $n$ is a positive integer. Each value has the same probability, namely $1/n$.

$$f(x) = 1/n, \quad \text{for } x = 1, 2, \ldots, n$$
$$\mu = \frac{n+1}{2}, \qquad \sigma^2 = \frac{n^2 - 1}{12}, \qquad m(t) = \frac{e^{(n+1)t} - e^t}{n(e^t - 1)}$$

Example. A typical example is tossing a fair cubic die. Then $n = 6$, $\mu = 3.5$, and $\sigma^2 = 35/12$.
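The gamma and beta identities above are easy to spot-check numerically. A minimal sketch (Python, standard library only; the particular values plugged in are arbitrary illustrations):

```python
import math

# Gamma generalizes the factorial: Γ(x) = (x-1)! for positive integers x.
assert math.gamma(5) == math.factorial(4)

# Recursion Γ(x+1) = x·Γ(x), checked at an arbitrary non-integer point.
x = 3.7
assert abs(math.gamma(x + 1) - x * math.gamma(x)) < 1e-9

# Γ(1/2) = √π, and the half-integer formula for n = 4:
# Γ(n + 1/2) = ((2n-1)(2n-3)···3·1 / 2^n) √π.
assert abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12
n = 4
odd_product = 7 * 5 * 3 * 1  # (2n-1)(2n-3)···3·1 for n = 4
assert abs(math.gamma(n + 0.5)
           - odd_product / 2**n * math.sqrt(math.pi)) < 1e-9

# B(α, β) = Γ(α)Γ(β)/Γ(α+β) = (α-1)!(β-1)!/(α+β-1)! for integer arguments.
a, b = 2, 3
beta = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
assert abs(beta - math.factorial(a - 1) * math.factorial(b - 1)
           / math.factorial(a + b - 1)) < 1e-12
```

Each `assert` passing silently confirms the corresponding identity at the chosen test points.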

### 3.1 Bernoulli distribution. Bernoulli$(p)$. Discrete.

The parameter $p$ is a real number between $0$ and $1$. The random variable $X$ takes on two values: $1$ (success) or $0$ (failure). The probability of success, $P(X{=}1)$, is the parameter $p$. The symbol $q$ is often used for $1 - p$, the probability of failure, $P(X{=}0)$.

$$f(0) = 1 - p, \quad f(1) = p$$
$$\mu = p, \qquad \sigma^2 = p(1-p) = pq, \qquad m(t) = pe^t + q$$

Example. If you flip a fair coin, then $p = \tfrac12$, $\mu = p = \tfrac12$, and $\sigma^2 = \tfrac14$.

Maximum likelihood estimator. Given a sample $x_1, \ldots, x_n$ from a Bernoulli distribution with unknown $p$, the maximum likelihood estimator for $p$ is $\bar{x}$, the number of successes divided by $n$, the number of trials.

### 3.2 Binomial distribution. Binomial$(n, p)$. Discrete.

Here, $X$ is the sum of $n$ independent Bernoulli trials, each Bernoulli$(p)$, so $X = x$ means there were $x$ successes among the $n$ trials. (Note that Bernoulli$(p)$ = Binomial$(1, p)$.)

$$f(x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad \text{for } x = 0, 1, \ldots, n$$
$$\mu = np, \qquad \sigma^2 = np(1-p) = npq, \qquad m(t) = (pe^t + q)^n$$

Maximum likelihood estimator for $p$ is $x/n$, the number of successes divided by the number of trials.

Sampling with replacement. Sampling with replacement occurs when a set of $N$ elements has a subset of $M$ preferred elements, and $n$ elements are chosen at random, but the $n$ elements don't have to be distinct. In other words, after one element is chosen, it's put back in the set so it can be chosen again. Selecting a preferred element is success, and that happens with probability $p = M/N$, while selecting a nonpreferred element is failure, and that happens with probability $q = 1 - p = (N-M)/N$. Thus, sampling with replacement is a Bernoulli process.

A bit of history. The first serious development in the theory of probability was in the 1650s when Pascal and Fermat investigated the binomial distribution in the special case $p = \tfrac12$. Pascal published the resulting theory of binomial coefficients and properties of what we now call Pascal's triangle. In the very early 1700s Jacob Bernoulli extended these results to general values of $p$.
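The binomial formulas above can be verified directly from the mass function. A small sketch (Python, standard library only; $n = 10$, $p = 0.3$ are illustrative choices):

```python
import math

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p)."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
pmf = [binomial_pmf(x, n, p) for x in range(n + 1)]

# The probabilities sum to 1, and the mean and variance computed directly
# from the pmf match the closed forms np and npq.
assert abs(sum(pmf) - 1) < 1e-12
mean = sum(x * f for x, f in enumerate(pmf))
var = sum((x - mean)**2 * f for x, f in enumerate(pmf))
assert abs(mean - n * p) < 1e-12
assert abs(var - n * p * (1 - p)) < 1e-12

# Bernoulli(p) = Binomial(1, p): f(0) = 1 - p, f(1) = p.
assert binomial_pmf(0, 1, p) == 1 - p and binomial_pmf(1, 1, p) == p
```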

### 3.3 Geometric distribution. Geometric$(p)$. Discrete.

When independent Bernoulli trials are repeated, each with probability $p$ of success, the number of trials $X$ it takes to get the first success has a geometric distribution.

$$f(x) = q^{x-1} p, \quad \text{for } x = 1, 2, \ldots$$
$$\mu = \frac{1}{p}, \qquad \sigma^2 = \frac{1-p}{p^2}, \qquad m(t) = \frac{pe^t}{1 - qe^t}$$

### 3.4 Negative binomial distribution. NegativeBinomial$(p, r)$. Discrete.

When independent Bernoulli trials are repeated, each with probability $p$ of success, and $X$ is the trial number when $r$ successes are first achieved, then $X$ has a negative binomial distribution. Note that Geometric$(p)$ = NegativeBinomial$(p, 1)$.

$$f(x) = \binom{x-1}{r-1} p^r q^{x-r}, \quad \text{for } x = r, r+1, \ldots$$
$$\mu = \frac{r}{p}, \qquad \sigma^2 = \frac{rq}{p^2}, \qquad m(t) = \left(\frac{pe^t}{1 - qe^t}\right)^{\!r}$$

### 3.5 Hypergeometric distribution. Hypergeometric$(N, M, n)$. Discrete.

In a Bernoulli process, given that there are $M$ successes among $N$ trials, the number $X$ of successes among the first $n$ trials has a Hypergeometric$(N, M, n)$ distribution.

$$f(x) = \frac{\binom{M}{x}\binom{N-M}{n-x}}{\binom{N}{n}}, \quad \text{for } x = 0, 1, \ldots, n$$
$$\mu = np, \qquad \sigma^2 = npq\,\frac{N-n}{N-1}$$

Sampling without replacement. Sampling with replacement was mentioned above in the section on the binomial distribution. Sampling without replacement is similar, but once an element is selected from a set, it is taken out of the set so that it can't be selected again. Formally, we have a set of $N$ elements with a subset of $M$ preferred elements, and $n$ distinct elements among the $N$ elements are to be chosen at random. The symbol $p$ is often used for the probability of choosing a preferred element, that is, $p = M/N$, and $q = 1 - p$.
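The hypergeometric mean and variance, including the finite-population correction factor $(N-n)/(N-1)$, can be checked from the mass function. A sketch (Python, standard library only; $N = 50$, $M = 20$, $n = 10$ are illustrative):

```python
import math

def hypergeom_pmf(x, N, M, n):
    """P(X = x) for X ~ Hypergeometric(N, M, n): x preferred elements among
    n drawn without replacement from N elements of which M are preferred."""
    return math.comb(M, x) * math.comb(N - M, n - x) / math.comb(N, n)

N, M, n = 50, 20, 10
p = M / N
pmf = [hypergeom_pmf(x, N, M, n) for x in range(n + 1)]

assert abs(sum(pmf) - 1) < 1e-12
mean = sum(x * f for x, f in enumerate(pmf))
var = sum((x - mean)**2 * f for x, f in enumerate(pmf))

# Mean np, as with the binomial; variance is npq shrunk by (N-n)/(N-1).
assert abs(mean - n * p) < 1e-12
assert abs(var - n * p * (1 - p) * (N - n) / (N - 1)) < 1e-9
```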

### 4.2 Exponential distribution. Exponential$(\lambda)$. Continuous.

When events occur uniformly at random over time at a rate of $\lambda$ events per unit time, the random variable $X$ giving the time to the first event has an exponential distribution.

$$f(x) = \lambda e^{-\lambda x}, \quad \text{for } x \in [0, \infty)$$
$$\mu = 1/\lambda, \qquad \sigma^2 = 1/\lambda^2, \qquad m(t) = (1 - t/\lambda)^{-1}$$

### 4.3 Gamma distribution. Gamma$(\lambda, r)$ or Gamma$(\alpha, \beta)$. Continuous.

In the same Poisson process as for the exponential distribution, the gamma distribution gives the time to the $r$th event. Thus, Exponential$(\lambda)$ = Gamma$(\lambda, 1)$. The gamma distribution also has applications when $r$ is not an integer. For that generality the factorial function is replaced by the gamma function, $\Gamma(x)$, described above.

There is an alternate parameterization Gamma$(\alpha, \beta)$ of the family of gamma distributions. The connection is $\alpha = r$, and $\beta = 1/\lambda$, which is the expected time to the first event in a Poisson process.

$$f(x) = \frac{\lambda^r}{\Gamma(r)}\,x^{r-1} e^{-\lambda x} = \frac{x^{\alpha-1} e^{-x/\beta}}{\beta^\alpha\,\Gamma(\alpha)}, \quad \text{for } x \in [0, \infty)$$
$$\mu = r/\lambda = \alpha\beta, \qquad \sigma^2 = r/\lambda^2 = \alpha\beta^2, \qquad m(t) = (1 - t/\lambda)^{-r} = (1 - \beta t)^{-\alpha}$$

Application to Bayesian statistics. Gamma distributions are used in Bayesian statistics as conjugate priors for the distributions in the Poisson process. In Gamma$(\alpha, \beta)$, $\alpha$ counts the number of occurrences observed while $\beta$ keeps track of the elapsed time.

### 4.4 Beta distribution. Beta$(\alpha, \beta)$. Continuous.

In a Poisson process, if $\alpha + \beta$ events occur in a time interval, then the fraction of that interval until the $\alpha$th event occurs has a Beta$(\alpha, \beta)$ distribution. Note that Beta$(1, 1)$ = Uniform$(0, 1)$. In the following formula, $B(\alpha, \beta)$ is the beta function described above.

$$f(x) = \frac{1}{B(\alpha, \beta)}\,x^{\alpha-1} (1-x)^{\beta-1}, \quad \text{for } x \in [0, 1]$$
$$\mu = \frac{\alpha}{\alpha+\beta}, \qquad \sigma^2 = \frac{\alpha\beta}{(\alpha+\beta)^2 (\alpha+\beta+1)}$$
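The two gamma parameterizations, and the fact that Exponential$(\lambda)$ = Gamma$(\lambda, 1)$, can be checked by comparing the densities pointwise. A sketch (Python, standard library only; $\lambda = 2$ and the test points are arbitrary):

```python
import math

def gamma_pdf(x, lam, r):
    """Density of Gamma(λ, r) at x ≥ 0, in the (λ, r) parameterization."""
    return lam**r * x**(r - 1) * math.exp(-lam * x) / math.gamma(r)

lam = 2.0

# Exponential(λ) = Gamma(λ, 1): the densities agree at every point.
for x in [0.1, 0.5, 1.0, 3.0]:
    assert abs(gamma_pdf(x, lam, 1) - lam * math.exp(-lam * x)) < 1e-12

# The (α, β) parameterization with α = r and β = 1/λ gives the same density,
# including for a fractional r.
r = 3.5
alpha, beta = r, 1 / lam
for x in [0.1, 0.5, 1.0, 3.0]:
    alt = x**(alpha - 1) * math.exp(-x / beta) / (beta**alpha * math.gamma(alpha))
    assert abs(gamma_pdf(x, lam, r) - alt) < 1e-9
```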

Application to Bayesian statistics. Beta distributions are used in Bayesian statistics as conjugate priors for the distributions in the Bernoulli process. Indeed, that's just what Thomas Bayes (1702–1761) did. In Beta$(\alpha, \beta)$, $\alpha$ counts the number of successes observed while $\beta$ keeps track of the failures observed.

## Distributions related to the central limit theorem

The Central Limit Theorem says sample means and sample sums approach normal distributions as the sample size approaches infinity.

### 5.1 Normal distribution. Normal$(\mu, \sigma^2)$. Continuous.

The normal distribution, also called the Gaussian distribution, is ubiquitous in probability and statistics. The parameters $\mu$ and $\sigma^2$ are real numbers, $\sigma^2$ being positive with positive square root $\sigma$.

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad \text{for } x \in \mathbb{R}$$
$$\mu = \mu, \qquad \sigma^2 = \sigma^2, \qquad m(t) = \exp(\mu t + \sigma^2 t^2/2)$$

The standard normal distribution has $\mu = 0$ and $\sigma^2 = 1$. Thus, its density function is $f(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}$, and its moment generating function is $m(t) = e^{t^2/2}$.

Maximum likelihood estimators. Given a sample $x_1, \ldots, x_n$ from a normal distribution with unknown mean $\mu$ and variance $\sigma^2$, the maximum likelihood estimator for $\mu$ is the sample mean $\bar{x}$, and the maximum likelihood estimator for $\sigma^2$ is the sample variance

$$\frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2.$$

(Notice that the denominator is $n$, not $n-1$, for the maximum likelihood estimator.)

Applications. Normal distributions are used in statistics to make inferences about the population mean when the sample size $n$ is large.

Application to Bayesian statistics. Normal distributions are used in Bayesian statistics as conjugate priors for the family of normal distributions with a known variance.

### 5.2 $\chi^2$-distribution. Chi-squared$(\nu)$. Continuous.

The parameter $\nu$, the number of degrees of freedom, is a positive integer. I generally use this Greek letter nu for degrees of freedom. This is the distribution of the sum of the squares of $\nu$ independent standard normal random variables.
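The point about the denominator of the normal maximum likelihood estimator is worth seeing on numbers. A sketch (Python, standard library only; the data values are made up for illustration):

```python
# Maximum likelihood estimates for a normal sample: the sample mean, and the
# sample variance with denominator n (not n - 1).
data = [4.1, 5.3, 3.8, 6.0, 5.1, 4.7]
n = len(data)

mu_hat = sum(data) / n
var_hat = sum((x - mu_hat)**2 for x in data) / n        # MLE: divide by n

# The unbiased estimator divides by n - 1 instead, so it is always a bit
# larger than the maximum likelihood estimate.
var_unbiased = sum((x - mu_hat)**2 for x in data) / (n - 1)
assert var_hat < var_unbiased
assert abs(var_hat * n - var_unbiased * (n - 1)) < 1e-12
```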

A $\chi^2$-distribution is actually a special case of a gamma distribution with a fractional value for $r$: Chi-squared$(\nu)$ = Gamma$(\lambda, r)$ where $\lambda = \tfrac12$ and $r = \nu/2$.

$$f(x) = \frac{x^{\nu/2 - 1} e^{-x/2}}{2^{\nu/2}\,\Gamma(\nu/2)}, \quad \text{for } x \ge 0$$
$$\mu = \nu, \qquad \sigma^2 = 2\nu, \qquad m(t) = (1 - 2t)^{-\nu/2}$$

Applications. $\chi^2$-distributions are used in statistics to make inferences on the population variance when the population is assumed to be normally distributed.

### 5.3 Student's $T$-distribution. T$(\nu)$. Continuous.

If $Y$ is Normal$(0, 1)$ and $Z$ is Chi-squared$(\nu)$ and independent of $Y$, then $X = Y/\sqrt{Z/\nu}$ has a $T$-distribution with $\nu$ degrees of freedom.

$$f(x) = \frac{\Gamma\!\left(\frac{\nu+1}{2}\right)}{\sqrt{\pi\nu}\,\Gamma\!\left(\frac{\nu}{2}\right)} \left(1 + \frac{x^2}{\nu}\right)^{-(\nu+1)/2}, \quad \text{for } x \in \mathbb{R}$$
$$\mu = 0, \qquad \sigma^2 = \frac{\nu}{\nu - 2}, \text{ when } \nu > 2$$

Applications. $T$-distributions are used in statistics to make inferences on the population mean when the population is assumed to be normally distributed, especially when the sample size is small.

### 5.4 Snedecor–Fisher's $F$-distribution. F$(\nu_1, \nu_2)$. Continuous.

If $Y$ and $Z$ are independent $\chi^2$-random variables with $\nu_1$ and $\nu_2$ degrees of freedom, respectively, then

$$X = \frac{Y/\nu_1}{Z/\nu_2}$$

has an $F$-distribution with $(\nu_1, \nu_2)$ degrees of freedom. Note that if $X$ is T$(\nu)$, then $X^2$ is F$(1, \nu)$.

$$f(x) = \frac{(\nu_1/\nu_2)^{\nu_1/2}\,x^{\nu_1/2 - 1}}{B\!\left(\frac{\nu_1}{2}, \frac{\nu_2}{2}\right)\left(1 + \frac{\nu_1}{\nu_2} x\right)^{(\nu_1+\nu_2)/2}}, \quad \text{for } x > 0$$
$$\mu = \frac{\nu_2}{\nu_2 - 2}, \text{ when } \nu_2 > 2, \qquad \sigma^2 = \frac{2\nu_2^2 (\nu_1 + \nu_2 - 2)}{\nu_1 (\nu_2 - 2)^2 (\nu_2 - 4)}, \text{ when } \nu_2 > 4$$

Applications. $F$-distributions are used in statistics when comparing variances of two populations.
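The relation "if $X$ is T$(\nu)$, then $X^2$ is F$(1, \nu)$" can be verified at the level of densities: by a change of variables (using the symmetry of the $T$ density), the density of $X^2$ at $y$ is $f_T(\sqrt{y})/\sqrt{y}$, which should equal the F$(1, \nu)$ density. A sketch (Python, standard library only; $\nu = 7$ and the test points are arbitrary):

```python
import math

def t_pdf(x, nu):
    """Density of Student's T(ν)."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(math.pi * nu) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def f_pdf(x, n1, n2):
    """Density of F(ν1, ν2), with B(a, b) = Γ(a)Γ(b)/Γ(a+b)."""
    B = math.gamma(n1 / 2) * math.gamma(n2 / 2) / math.gamma((n1 + n2) / 2)
    return ((n1 / n2) ** (n1 / 2) * x ** (n1 / 2 - 1)
            / (B * (1 + n1 * x / n2) ** ((n1 + n2) / 2)))

# Density of X² at y is f_T(√y)/√y; it should match the F(1, ν) density.
nu = 7
for y in [0.2, 1.0, 2.5]:
    assert abs(t_pdf(math.sqrt(y), nu) / math.sqrt(y) - f_pdf(y, 1, nu)) < 1e-10
```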

## Table of discrete and continuous distributions

| Distribution | Type | Mass/density function $f(x)$ | Mean $\mu$ | Variance $\sigma^2$ |
|---|---|---|---|---|
| Uniform$(n)$ | D | $1/n$, for $x = 1, 2, \ldots, n$ | $(n+1)/2$ | $(n^2-1)/12$ |
| Uniform$(a, b)$ | C | $\dfrac{1}{b-a}$, for $x \in [a, b]$ | $\dfrac{a+b}{2}$ | $\dfrac{(b-a)^2}{12}$ |
| Bernoulli$(p)$ | D | $f(0) = 1-p$, $f(1) = p$ | $p$ | $p(1-p)$ |
| Binomial$(n, p)$ | D | $\binom{n}{x} p^x (1-p)^{n-x}$, for $x = 0, 1, \ldots, n$ | $np$ | $npq$ |
| Geometric$(p)$ | D | $q^{x-1} p$, for $x = 1, 2, \ldots$ | $1/p$ | $(1-p)/p^2$ |
| NegativeBinomial$(p, r)$ | D | $\binom{x-1}{r-1} p^r q^{x-r}$, for $x = r, r+1, \ldots$ | $r/p$ | $r(1-p)/p^2$ |
| Hypergeometric$(N, M, n)$ | D | $\dfrac{\binom{M}{x}\binom{N-M}{n-x}}{\binom{N}{n}}$, for $x = 0, 1, \ldots, n$ | $np$ | $np(1-p)\dfrac{N-n}{N-1}$ |
| Poisson$(\lambda t)$ | D | $\dfrac{(\lambda t)^x e^{-\lambda t}}{x!}$, for $x = 0, 1, \ldots$ | $\lambda t$ | $\lambda t$ |
| Exponential$(\lambda)$ | C | $\lambda e^{-\lambda x}$, for $x \in [0, \infty)$ | $1/\lambda$ | $1/\lambda^2$ |
| Gamma$(\lambda, r)$ = Gamma$(\alpha, \beta)$ | C | $\dfrac{\lambda^r}{\Gamma(r)} x^{r-1} e^{-\lambda x} = \dfrac{x^{\alpha-1} e^{-x/\beta}}{\beta^\alpha \Gamma(\alpha)}$, for $x \in [0, \infty)$ | $r/\lambda = \alpha\beta$ | $r/\lambda^2 = \alpha\beta^2$ |
| Beta$(\alpha, \beta)$ | C | $\dfrac{1}{B(\alpha, \beta)} x^{\alpha-1}(1-x)^{\beta-1}$, for $0 \le x \le 1$ | $\dfrac{\alpha}{\alpha+\beta}$ | $\dfrac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$ |
| Normal$(\mu, \sigma^2)$ | C | $\dfrac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\dfrac{(x-\mu)^2}{2\sigma^2}\right)$, for $x \in \mathbb{R}$ | $\mu$ | $\sigma^2$ |
| Chi-squared$(\nu)$ | C | $\dfrac{x^{\nu/2-1} e^{-x/2}}{2^{\nu/2}\,\Gamma(\nu/2)}$, for $x \ge 0$ | $\nu$ | $2\nu$ |
| T$(\nu)$ | C | $\dfrac{\Gamma(\frac{\nu+1}{2})}{\sqrt{\pi\nu}\,\Gamma(\frac{\nu}{2})}\left(1 + \dfrac{x^2}{\nu}\right)^{-(\nu+1)/2}$, for $x \in \mathbb{R}$ | $0$ | $\dfrac{\nu}{\nu-2}$ |
| F$(\nu_1, \nu_2)$ | C | $\dfrac{(\nu_1/\nu_2)^{\nu_1/2}\,x^{\nu_1/2-1}}{B(\frac{\nu_1}{2}, \frac{\nu_2}{2})\left(1 + \frac{\nu_1}{\nu_2}x\right)^{(\nu_1+\nu_2)/2}}$, for $x > 0$ | $\dfrac{\nu_2}{\nu_2-2}$ | $\dfrac{2\nu_2^2(\nu_1+\nu_2-2)}{\nu_1(\nu_2-2)^2(\nu_2-4)}$ |

### 3. Continuous Random Variables

3. Continuous Random Variables A continuous random variable is one which can take any value in an interval (or union of intervals) The values that can be taken by such a variable cannot be listed. Such

### 1 A simple example. A short introduction to Bayesian statistics, part I Math 218, Mathematical Statistics D Joyce, Spring 2016

and P (B X). In order to do find those conditional probabilities, we ll use Bayes formula. We can easily compute the reverse probabilities A short introduction to Bayesian statistics, part I Math 18, Mathematical

### Random Variables of The Discrete Type

Random Variables of The Discrete Type Definition: Given a random experiment with an outcome space S, a function X that assigns to each element s in S one and only one real number x X(s) is called a random

### Sufficient Statistics and Exponential Family. 1 Statistics and Sufficient Statistics. Math 541: Statistical Theory II. Lecturer: Songfeng Zheng

Math 541: Statistical Theory II Lecturer: Songfeng Zheng Sufficient Statistics and Exponential Family 1 Statistics and Sufficient Statistics Suppose we have a random sample X 1,, X n taken from a distribution

### Continuous Random Variables. and Probability Distributions. Continuous Random Variables and Probability Distributions ( ) ( ) Chapter 4 4.

UCLA STAT 11 A Applied Probability & Statistics for Engineers Instructor: Ivo Dinov, Asst. Prof. In Statistics and Neurology Teaching Assistant: Neda Farzinnia, UCLA Statistics University of California,

### Some continuous and discrete distributions

Some continuous and discrete distributions Table of contents I. Continuous distributions and transformation rules. A. Standard uniform distribution U[0, 1]. B. Uniform distribution U[a, b]. C. Standard

### Summary of Probability

Summary of Probability Mathematical Physics I Rules of Probability The probability of an event is called P(A), which is a positive number less than or equal to 1. The total probability for all possible

### Contents. TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics. Yuming Jiang. Basic Concepts Random Variables

TTM4155: Teletraffic Theory (Teletrafikkteori) Probability Theory Basics Yuming Jiang 1 Some figures taken from the web. Contents Basic Concepts Random Variables Discrete Random Variables Continuous Random

### Random Variables and Their Expected Values

Discrete and Continuous Random Variables The Probability Mass Function The (Cumulative) Distribution Function Discrete and Continuous Random Variables The Probability Mass Function The (Cumulative) Distribution

### m (t) = e nt m Y ( t) = e nt (pe t + q) n = (pe t e t + qe t ) n = (qe t + p) n

1. For a discrete random variable Y, prove that E[aY + b] = ae[y] + b and V(aY + b) = a 2 V(Y). Solution: E[aY + b] = E[aY] + E[b] = ae[y] + b where each step follows from a theorem on expected value from

### Chapter 2. Random variables. 2.1 Fundamentals. Motivation 2.1 Inverse image of sets (Karr, 1993, p. 43)

Chapter 2 Random variables 2.1 Fundamentals Motivation 2.1 Inverse image of sets (Karr, 1993, p. 43) Before we introduce the concept of random variable (r.v.) extensively on inverse images of sets and

### Continuous random variables

Continuous random variables So far we have been concentrating on discrete random variables, whose distributions are not continuous. Now we deal with the so-called continuous random variables. A random

### Topic 2: Scalar random variables. Definition of random variables

Topic 2: Scalar random variables Discrete and continuous random variables Probability distribution and densities (cdf, pmf, pdf) Important random variables Expectation, mean, variance, moments Markov and

### Solution Using the geometric series a/(1 r) = x=1. x=1. Problem For each of the following distributions, compute

Math 472 Homework Assignment 1 Problem 1.9.2. Let p(x) 1/2 x, x 1, 2, 3,..., zero elsewhere, be the pmf of the random variable X. Find the mgf, the mean, and the variance of X. Solution 1.9.2. Using the

### Continuous Random Variables and Probability Distributions. Stat 4570/5570 Material from Devore s book (Ed 8) Chapter 4 - and Cengage

4 Continuous Random Variables and Probability Distributions Stat 4570/5570 Material from Devore s book (Ed 8) Chapter 4 - and Cengage Continuous r.v. A random variable X is continuous if possible values

### Discrete Binary Distributions

Discrete Binary Distributions Carl Edward Rasmussen November th, 26 Carl Edward Rasmussen Discrete Binary Distributions November th, 26 / 4 Key concepts Bernoulli: probabilities over binary variables Binomial:

### Examination 110 Probability and Statistics Examination

Examination 0 Probability and Statistics Examination Sample Examination Questions The Probability and Statistics Examination consists of 5 multiple-choice test questions. The test is a three-hour examination

### Expectation Discrete RV - weighted average Continuous RV - use integral to take the weighted average

PHP 2510 Expectation, variance, covariance, correlation Expectation Discrete RV - weighted average Continuous RV - use integral to take the weighted average Variance Variance is the average of (X µ) 2

### Normal distribution Approximating binomial distribution by normal 2.10 Central Limit Theorem

1.1.2 Normal distribution 1.1.3 Approimating binomial distribution by normal 2.1 Central Limit Theorem Prof. Tesler Math 283 October 22, 214 Prof. Tesler 1.1.2-3, 2.1 Normal distribution Math 283 / October

### POL 571: Expectation and Functions of Random Variables

POL 571: Expectation and Functions of Random Variables Kosuke Imai Department of Politics, Princeton University March 10, 2006 1 Expectation and Independence To gain further insights about the behavior

### MATH 201. Final ANSWERS August 12, 2016

MATH 01 Final ANSWERS August 1, 016 Part A 1. 17 points) A bag contains three different types of dice: four 6-sided dice, five 8-sided dice, and six 0-sided dice. A die is drawn from the bag and then rolled.

### What is Statistics? Lecture 1. Introduction and probability review. Idea of parametric inference

0. 1. Introduction and probability review 1.1. What is Statistics? What is Statistics? Lecture 1. Introduction and probability review There are many definitions: I will use A set of principle and procedures

### Stat260: Bayesian Modeling and Inference Lecture Date: February 1, Lecture 3

Stat26: Bayesian Modeling and Inference Lecture Date: February 1, 21 Lecture 3 Lecturer: Michael I. Jordan Scribe: Joshua G. Schraiber 1 Decision theory Recall that decision theory provides a quantification

### Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

### Fisher Information and Cramér-Rao Bound. 1 Fisher Information. Math 541: Statistical Theory II. Instructor: Songfeng Zheng

Math 54: Statistical Theory II Fisher Information Cramér-Rao Bound Instructor: Songfeng Zheng In the parameter estimation problems, we obtain information about the parameter from a sample of data coming

### Lecture 10: Other Continuous Distributions and Probability Plots

Lecture 10: Other Continuous Distributions and Probability Plots Devore: Section 4.4-4.6 Page 1 Gamma Distribution Gamma function is a natural extension of the factorial For any α > 0, Γ(α) = 0 x α 1 e

### Answers to some even exercises

Answers to some even eercises Problem - P (X = ) = P (white ball chosen) = /8 and P (X = ) = P (red ball chosen) = 7/8 E(X) = (P (X = ) + P (X = ) = /8 + 7/8 = /8 = /9 E(X ) = ( ) (P (X = ) + P (X = )

### Introduction to Probability

Motoya Machida January 7, 6 This material is designed to provide a foundation in the mathematics of probability. We begin with the basic concepts of probability, such as events, random variables, independence,

### Lecture 6: Discrete Distributions

Lecture 6: Discrete Distributions 4F3: Machine Learning Joaquin Quiñonero-Candela and Carl Edward Rasmussen Department of Engineering University of Cambridge http://mlg.eng.cam.ac.uk/teaching/4f3/ Quiñonero-Candela

### Chapter 1. Probability, Random Variables and Expectations. 1.1 Axiomatic Probability

Chapter 1 Probability, Random Variables and Expectations Note: The primary reference for these notes is Mittelhammer (1999. Other treatments of probability theory include Gallant (1997, Casella & Berger

### Bivariate Unit Normal. Lecture 19: Joint Distributions. Bivariate Unit Normal - Normalizing Constant. Bivariate Unit Normal, cont.

Lecture 19: Joint Distributions Statistics 14 If X and Y have independent unit normal distributions then their joint distribution f (x, y) is given by f (x, y) φ(x)φ(y) c e 1 (x +y ) Colin Rundel April,

### Introduction to Simulation Using MATLAB

Chapter 12 Introduction to Simulation Using MATLAB A. Rakhshan and H. Pishro-Nik 12.1 Analysis versus Computer Simulation A computer simulation is a computer program which attempts to represent the real

### Probabilities and Random Variables

Probabilities and Random Variables This is an elementary overview of the basic concepts of probability theory. 1 The Probability Space The purpose of probability theory is to model random experiments so

### Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 4: Geometric Distribution Negative Binomial Distribution Hypergeometric Distribution Sections 3-7, 3-8 The remaining discrete random

### Chi-Square Distribution

Chi-Square Distribution James W. Stoutenborough and Paul E. Johnson June 10, 2013 1 Introduction The Chi-Square distribution is a staple of statistical analysis. It is

### Chapter 3: Discrete Random Variable and Probability Distribution. January 28, 2014

STAT511 Spring 2014 Lecture Notes 1 Chapter 3: Discrete Random Variable and Probability Distribution January 28, 2014 3 Discrete Random Variables Chapter Overview Random Variable (r.v. Definition Discrete

### Chapter 5. Random variables

Random variables random variable numerical variable whose value is the outcome of some probabilistic experiment; we use uppercase letters, like X, to denote such a variable and lowercase letters, like

### Lecture.7 Poisson Distributions - properties, Normal Distributions- properties. Theoretical Distributions. Discrete distribution

Lecture.7 Poisson Distributions - properties, Normal Distributions- properties Theoretical distributions are Theoretical Distributions 1. Binomial distribution 2. Poisson distribution Discrete distribution

### Senior Secondary Australian Curriculum

Senior Secondary Australian Curriculum Mathematical Methods Glossary Unit 1 Functions and graphs Asymptote A line is an asymptote to a curve if the distance between the line and the curve approaches zero

### Random Variables and their Distributions

Chapter 1 Random Variables and their Distributions 1.1 Random Variables Definition 1.1. A random variable X is a function that assigns one and only one numerical value to each outcome of an experiment,

### Homework n o 7 Math 505a

Homework n o 7 Math 505a Two players (player A and player B) play a board game. The rule is the following: both player start at position 0. Then, at every round, a dice is thrown: If the result is different

### Chapter 4 Expected Values

Chapter 4 Expected Values 4. The Expected Value of a Random Variables Definition. Let X be a random variable having a pdf f(x). Also, suppose the the following conditions are satisfied: x f(x) converges

### Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution

Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover

### MATH4427 Notebook 3 Spring MATH4427 Notebook Hypotheses: Definitions and Examples... 3

MATH4427 Notebook 3 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 2009-2016 by Jenny A. Baglivo. All Rights Reserved. Contents 3 MATH4427 Notebook 3 3 3.1 Hypotheses: Definitions and Examples............................

### Continuous Random Variables

Continuous Random Variables COMP 245 STATISTICS Dr N A Heard Contents 1 Continuous Random Variables 2 11 Introduction 2 12 Probability Density Functions 3 13 Transformations 5 2 Mean, Variance and Quantiles

### Lecture 13: Some Important Continuous Probability Distributions (Part 2)

Lecture 13: Some Important Continuous Probability Distributions (Part 2) Kateřina Staňková Statistics (MAT1003) May 14, 2012 Outline 1 Erlang distribution Formulation Application of Erlang distribution

### Random Variables. Consider a probability model (Ω, P ). Discrete Random Variables Chs. 2, 3, 4. Definition. A random variable is a function

Rom Variables Discrete Rom Variables Chs.,, 4 Rom Variables Probability Mass Functions Expectation: The Mean Variance Special Distributions Hypergeometric Binomial Poisson Joint Distributions Independence

### Renewal Theory. (iv) For s < t, N(t) N(s) equals the number of events in (s, t].

Renewal Theory Def. A stochastic process {N(t), t 0} is said to be a counting process if N(t) represents the total number of events that have occurred up to time t. X 1, X 2,... times between the events

### Continuous Random Variables

Probability 2 - Notes 7 Continuous Random Variables Definition. A random variable X is said to be a continuous random variable if there is a function f X (x) (the probability density function or p.d.f.)

### Math 141. Lecture 5: Expected Value. Albyn Jones 1. jones/courses/ Library 304. Albyn Jones Math 141

Math 141 Lecture 5: Expected Value Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 History The early history of probability theory is intimately related to questions arising

### Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0

Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0 This primer provides an overview of basic concepts and definitions in probability and statistics. We shall

### Prof. Tesler. Math 186 and 283 Winter Prof. Tesler Poisson & Exponential Distributions Math 186 / Winter / 31

Math 186: 4.2 Poisson Distribution: Counting Crossovers in Meiosis 4.2 Exponential and 4.6 Gamma Distributions: Distance Between Crossovers Math 283: Ewens & Grant 1.3.7, 4.1-4.2 Prof. Tesler Math 186

### Notes on the Negative Binomial Distribution

Notes on the Negative Binomial Distribution John D. Cook October 28, 2009 Abstract These notes give several properties of the negative binomial distribution. 1. Parameterizations 2. The connection between

### Review. Lecture 3: Probability Distributions. Poisson Distribution. May 8, 2012 GENOME 560, Spring Su In Lee, CSE & GS

Lecture 3: Probability Distributions May 8, 202 GENOME 560, Spring 202 Su In Lee, CSE & GS suinlee@uw.edu Review Random variables Discrete: Probability mass function (pmf) Continuous: Probability density

### WHERE DOES THE 10% CONDITION COME FROM?

1 WHERE DOES THE 10% CONDITION COME FROM? The text has mentioned The 10% Condition (at least) twice so far: p. 407 Bernoulli trials must be independent. If that assumption is violated, it is still okay

### Lecture 10: Characteristic Functions

Lecture 0: Characteristic Functions. Definition of characteristic functions. Complex random variables.2 Definition and basic properties of characteristic functions.3 Examples.4 Inversion formulas 2. Applications

### 4. Poisson Processes

4. Poisson Processes 4.1 Definition 4.2 Derivation of exponential distribution 4.3 Properties of exponential distribution a. Normalized spacings b. Campbell s Theorem c. Minimum of several exponential

### Probability, Random Variables and Expectations

Chapter Probability, Random Variables and Expectations Note: The primary reference for these notes is Mittelhammer (999). Other treatments of probability theory include Gallant (997), Casella & Berger

### 3.6: General Hypothesis Tests

3.6: General Hypothesis Tests The χ 2 goodness of fit tests which we introduced in the previous section were an example of a hypothesis test. In this section we now consider hypothesis tests more generally.

### 2WB05 Simulation Lecture 8: Generating random variables

2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating

### University of California, Berkeley, Statistics 134: Concepts of Probability

University of California, Berkeley, Statistics 134: Concepts of Probability Michael Lugo, Spring 211 Exam 2 solutions 1. A fair twenty-sided die has its faces labeled 1, 2, 3,..., 2. The die is rolled

### 4. Joint Distributions

Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 4. Joint Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space. Suppose

### 6 PROBABILITY GENERATING FUNCTIONS

6 PROBABILITY GENERATING FUNCTIONS Certain derivations presented in this course have been somewhat heavy on algebra. For example, determining the expectation of the Binomial distribution (page 5.1 turned

### Math 141. Lecture 7: Variance, Covariance, and Sums. Albyn Jones 1. 1 Library 304. jones/courses/141

Math 141 Lecture 7: Variance, Covariance, and Sums Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 Last Time Variance: expected squared deviation from the mean: Standard

### Introduction to Series and Sequences Math 121 Calculus II D Joyce, Spring 2013

Introduction to Series and Sequences Math Calculus II D Joyce, Spring 03 The goal. The main purpose of our study of series and sequences is to understand power series. A power series is like a polynomial

### Introduction to Probability

Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence

### Section 7.2 Confidence Intervals for Population Proportions

Section 7.2 Confidence Intervals for Population Proportions 2012 Pearson Education, Inc. All rights reserved. 1 of 83 Section 7.2 Objectives Find a point estimate for the population proportion Construct

### Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur

Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special Distributions-VI Today, I am going to introduce

### Some notes on the Poisson distribution

Some notes on the Poisson distribution Ernie Croot October 2, 2008 1 Introduction The Poisson distribution is one of the most important that we will encounter in this course; it is right up there with the

### FALL 2005 EXAM C SOLUTIONS

FALL 2005 EXAM C SOLUTIONS Question #1 Key: D. Ŝ(300) = 3/10 (there are three observations greater than 300). Ĥ(300) = −ln[Ŝ(300)] = −ln(0.3) = 1.204. Question #2: E(X | λ) = Var(X | λ) = λ, µ = v = E(λ)

### PROBABILITIES AND PROBABILITY DISTRIBUTIONS

Published in "Random Walks in Biology", 1983, Princeton University Press PROBABILITIES AND PROBABILITY DISTRIBUTIONS Howard C. Berg Table of Contents PROBABILITIES PROBABILITY DISTRIBUTIONS THE BINOMIAL

### Math 461 Fall 2006 Test 2 Solutions

Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X ∼ Exponential(λ). Justify the following two

### Statistical Intervals. Chapter 7 Stat 4570/5570 Material from Devore s book (Ed 8), and Cengage

7 Statistical Intervals Chapter 7 Stat 4570/5570 Material from Devore's book (Ed 8), and Cengage. Confidence Intervals: The CLT tells us that as the sample size n increases, the sample mean X̄ is close to
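The CLT/law-of-large-numbers claim above can be checked by simulation; this sketch assumes Uniform(0, 1) draws, whose true mean is 0.5, and shows that averages of many sample means land very close to it.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_mean(n):
    # mean of n independent Uniform(0, 1) draws
    return sum(random.random() for _ in range(n)) / n

# 200 replications of the sample mean of n = 1000 draws;
# their grand average should sit very near the true mean 0.5.
means = [sample_mean(1000) for _ in range(200)]
grand = sum(means) / len(means)
print(grand)
```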

### Probability density function: An arbitrary continuous random variable X is similarly described by its probability density function f(x) = f_X(x)

Week 6 notes: Continuous random variables and their probability densities. Uniform, normal, gamma, exponential, chi-squared distributions; normal approximation to the binomial. Uniform [0,1] random

### Generalized Linear Model Theory

Appendix B Generalized Linear Model Theory We describe the generalized linear model as formulated by Nelder and Wedderburn (1972), and discuss estimation of the parameters and tests of hypotheses. B.1

### The t Test. April 3-8, 2008

The t Test April 3-8, 2008. Again, we begin with independent normal observations X₁, ..., Xₙ with unknown mean µ and unknown variance σ². The likelihood function is L(µ, σ² | x) = (2πσ²)^(−n/2) exp(−(1/(2σ²)) Σ (xᵢ − µ)²). Thus, µ̂ = x̄.
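The scattered symbols in the excerpt appear to be the standard maximum-likelihood derivation for a normal sample; a reconstruction of those steps (standard result, not verbatim from the source):

```latex
% Log-likelihood of n independent N(mu, sigma^2) observations
L(\mu,\sigma^2 \mid x) = (2\pi\sigma^2)^{-n/2}
  \exp\!\Big(-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\mu)^2\Big)
% Setting the mu-derivative of ln L to zero gives the MLE of the mean
\frac{\partial}{\partial\mu}\ln L(\mu,\sigma^2\mid x)
  = \frac{1}{\sigma^2}\sum_{i=1}^n (x_i-\mu) = 0
  \;\Rightarrow\; \hat\mu = \bar{x}
% Under H_0 : \mu = \mu_0, the restricted variance estimate is
\hat\sigma_0^2 = \frac{1}{n}\sum_{i=1}^n (x_i-\mu_0)^2
```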

### 18 Poisson Process

18 Poisson Process. A counting process is a random process N(t), t ≥ 0, such that: 1. N(t) is a nonnegative integer for each t; 2. N(t) is nondecreasing in t; and 3. N(t) is right-continuous.
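The three properties above can be illustrated by simulating a Poisson process: arrival times are cumulative sums of exponential interarrival gaps, and N(t) counts arrivals up to time t. The rate and horizon below are assumed for illustration.

```python
import random

def arrival_times(lam, t_max, rng):
    # cumulative sums of Exponential(lam) gaps, kept while they fit in [0, t_max]
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam)
        if t > t_max:
            return times
        times.append(t)

def N(t, times):
    # N(t) = number of arrivals at or before time t:
    # a nonnegative integer, nondecreasing and right-continuous in t
    return sum(1 for a in times if a <= t)

rng = random.Random(42)
times = arrival_times(lam=2.0, t_max=10.0, rng=rng)
```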

### Continuous Distributions

MAT 2379 3X (Summer 2012) Continuous Distributions Up to now we have been working with discrete random variables whose range R_X is finite or countable. However, we will have to allow for variables that can take

### Random variables, probability distributions, binomial random variable

Week 4 lecture notes. Random variables, probability distributions, binomial random variable. Example 1: Consider the experiment of flipping a fair coin three times. The number of tails that
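The coin-flip example above can be worked out by brute force: enumerate all 2³ equally likely outcomes, count tails in each, and the resulting pmf is exactly Binomial(3, 1/2).

```python
from itertools import product
from math import comb

# Enumerate all outcomes of three fair coin flips and tally the number of tails.
counts = {}
for flips in product("HT", repeat=3):
    k = flips.count("T")
    counts[k] = counts.get(k, 0) + 1

# Each of the 8 outcomes has probability 1/8.
pmf = {k: c / 8 for k, c in sorted(counts.items())}
print(pmf)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}

# Agrees with the Binomial(3, 1/2) formula C(3, k) / 2^3.
for k in range(4):
    assert pmf[k] == comb(3, k) / 8
```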

### the number of organisms in the squares of a haemocytometer? the number of goals scored by a football team in a match?

Poisson Random Variables (Rees: 6.8 6.14) Examples: What is the distribution of: the number of organisms in the squares of a haemocytometer? the number of hits on a web site in one hour? the number of
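For count data like the examples above, the Poisson(λ) pmf is f(k) = e^{−λ} λ^k / k!, and its mean and variance both equal λ. A quick numeric check, with λ = 1.5 as an assumed example rate:

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    # P(X = k) for X ~ Poisson(lam)
    return exp(-lam) * lam**k / factorial(k)

lam = 1.5
ks = range(60)  # truncate the infinite sum; the tail mass here is negligible

mean = sum(k * poisson_pmf(lam, k) for k in ks)
var = sum((k - mean) ** 2 * poisson_pmf(lam, k) for k in ks)
# Both should come out equal to lam.
```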

### Discrete probability and the laws of chance

Chapter 8 Discrete probability and the laws of chance 8.1 Introduction In this chapter we lay the groundwork for calculations and rules governing simple discrete probabilities. These steps will be essential

### Probability Calculator

Chapter 95 Introduction Most statisticians have a set of probability tables that they refer to in doing their statistical work. This procedure provides you with a set of electronic statistical tables that

### Review Solutions, Exam 3, Math 338

Review Solutions, Exam 3, Math 338. 1. Define a random sample: A random sample is a collection of random variables, typically indexed as X₁, ..., Xₙ. 2. Why is the t distribution used in association with sampling?

### CHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is.

Some Continuous Probability Distributions. CHAPTER 6: Continuous Uniform Distribution. 6.1 Definition: The density function of the continuous random variable X on the interval [A, B] is f(x; A, B) = 1/(B − A) for A ≤ x ≤ B
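Since the uniform density is constant at 1/(B − A) on [A, B], probabilities reduce to overlap lengths divided by B − A. A minimal sketch, with the interval endpoints assumed for illustration:

```python
def uniform_prob(A, B, a, b):
    # P(a <= X <= b) for X ~ Uniform[A, B]:
    # length of [a, b] ∩ [A, B], divided by B - A
    lo, hi = max(A, a), min(B, b)
    return max(0.0, hi - lo) / (B - A)

# Example: X ~ Uniform[2, 6]
print(uniform_prob(2, 6, 3, 5))  # 0.5
```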

### 0.1 Solutions to CAS Exam ST, Fall 2015

0.1 Solutions to CAS Exam ST, Fall 2015 The questions can be found at http://www.casact.org/admissions/studytools/examst/f15-st.pdf. 1. The variance equals the parameter λ of the Poisson distribution for

### Baby Bayes using R. Rebecca C. Steorts

Baby Bayes using R Rebecca C. Steorts Last LaTeX'd Tuesday 12th January, 2016 Copyright © 2016 Rebecca C. Steorts; do not distribute without the author's permission. Contents 1 Motivations and Introduction

### Mathematical Notation and Symbols

Mathematical Notation and Symbols 1.1 Introduction This introductory block reminds you of important notations and conventions used throughout engineering mathematics. We discuss the arithmetic of numbers,

### Chapter 4 Lecture Notes

Chapter 4 Lecture Notes Random Variables October 27, 2015 1 Section 4.1 Random Variables A random variable is typically a real-valued function defined on the sample space of some experiment. For instance,

### III. Famous Discrete Distributions: The Binomial and Poisson Distributions

III. Famous Discrete Distributions: The Binomial and Poisson Distributions Up to this point, we have concerned ourselves with the general properties of categorical and continuous distributions, illustrated

### Homework Set on Probabilistic Failure Models

Homework Set on Probabilistic Failure Models List of problems: 1. Time to failure 1, p.2 (p.5). 2. Time to failure 2, p.2 (p.6). 3. Time to failure 3, p.3 (p.7). 4. Scratches, p.4 (p.8). 5. Many identical

### Sums of Independent Random Variables

Chapter 7 Sums of Independent Random Variables 7.1 Sums of Discrete Random Variables In this chapter we turn to the important question of determining the distribution of a sum of independent random variables
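For discrete variables, the distribution of a sum of independent random variables is the convolution of their pmfs. A sketch using the classic example of two fair dice:

```python
def convolve(pmf1, pmf2):
    # pmf of X + Y for independent X ~ pmf1, Y ~ pmf2
    out = {}
    for x, p in pmf1.items():
        for y, q in pmf2.items():
            out[x + y] = out.get(x + y, 0.0) + p * q
    return out

die = {k: 1 / 6 for k in range(1, 7)}
total = convolve(die, die)  # sums 2..12; 7 is the most likely outcome
```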

### Lecture Notes 1. Brief Review of Basic Probability

Probability Review Lecture Notes Brief Review of Basic Probability I assume you know basic probability. Chapters 1-3 are a review. I will assume you have read and understood Chapters 1-3. Here is a very

### MA 485-1E, Probability (Dr Chernov) Final Exam Wed, Dec 10, 2003 Student s name Be sure to show all your work. Full credit is given for 100 points.

MA 485-1E, Probability (Dr Chernov) Final Exam Wed, Dec 10, 2003 Student s name Be sure to show all your work. Full credit is given for 100 points. 1. (10 pts) Assume that accidents on a 600 miles long

### CSE 312, 2011 Winter, W.L.Ruzzo. 7. continuous random variables

CSE 312, 2011 Winter, W.L. Ruzzo. 7. Continuous random variables. Discrete random variable: takes values in a finite or countable set, e.g. X ∈ {1, 2, ..., 6} with equal probability

### MT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo

MT426 Notebook 3 Fall 2012 prepared by Professor Jenny Baglivo. Copyright 2004-2012 by Jenny A. Baglivo. All Rights Reserved. Contents: 3 MT426 Notebook 3; 3.1 Definitions

### Master s Theory Exam Spring 2006

Spring 2006 This exam contains 7 questions. You should attempt them all. Each question is divided into parts to help lead you through the material. You should attempt to complete as much of each problem

### An Introduction to Basic Statistics and Probability

An Introduction to Basic Statistics and Probability Shenek Heyward, NCSU. Outline: basic probability concepts; conditional probability; Discrete Random