# Continuous random variables

So far we have been concentrating on discrete random variables, whose distribution functions are not continuous. Now we deal with the so-called continuous random variables. A random variable $X$ is called a continuous random variable if its distribution function $F$ is a continuous function on $\mathbb{R}$, or equivalently, if $P(X = x) = 0$ for every $x \in \mathbb{R}$. Of course there are random variables which are neither discrete nor continuous, but in this course we will concentrate on discrete random variables and continuous random variables. Among continuous random variables, we will concentrate on a subclass.

Definition 1. A random variable $X$ is called an absolutely continuous random variable if there is a nonnegative function $f$ on $\mathbb{R}$ such that
$$P(X \le x) = \int_{-\infty}^{x} f(t)\,dt, \quad \text{for every } x \in \mathbb{R}.$$
The function $f$ in the definition above is called the probability density of the absolutely continuous random variable $X$, and it must satisfy
$$\int_{-\infty}^{\infty} f(t)\,dt = 1.$$

Definition 2. A nonnegative function $f$ on $\mathbb{R}$ satisfying $\int_{-\infty}^{\infty} f(t)\,dt = 1$ is called a probability density function.

It can be shown that if $f$ is a probability density function, then there exists an absolutely continuous random variable $X$ with $f$ as its probability density function. Given a nonnegative function $g$ on $\mathbb{R}$ such that $\int_{-\infty}^{\infty} g(t)\,dt = c$ is a finite positive number, the function $\frac{1}{c} g(x)$ is a probability density function.

From Definition 1 we can easily see that an absolutely continuous random variable is a continuous random variable. However, there are continuous random variables which are not absolutely continuous, and we call these random variables singularly continuous random variables. In this course, we will not deal with singularly continuous random variables. If $X$ is an absolutely continuous random variable with density $f$, then its distribution function is given by
$$F(x) = \int_{-\infty}^{x} f(t)\,dt, \quad \text{for every } x \in \mathbb{R}.$$
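As a quick numerical illustration of the normalization remark above — any nonnegative $g$ with finite positive integral $c$ yields the density $\frac{1}{c}g$ — here is a sketch using an arbitrary hypothetical choice $g(x) = x^2$ on $(0, 3)$ and a plain midpoint rule (not anything prescribed by the notes):

```python
def integrate(f, a, b, n=100_000):
    # Midpoint-rule numerical integration (crude but adequate here).
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Hypothetical nonnegative g on (0, 3): g(x) = x^2, whose integral is c = 9.
g = lambda x: x * x
c = integrate(g, 0.0, 3.0)
density = lambda x: g(x) / c          # (1/c) g is then a probability density

total_mass = integrate(density, 0.0, 3.0)   # should come out to 1
```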

Conversely, if $X$ is an absolutely continuous random variable with distribution function $F$, then its density function is given by $f(x) = F'(x)$ at those points $x$ where $F$ is differentiable, and $f(x) = 0$ at those points $x$ where $F$ is not differentiable. If $X$ is an absolutely continuous random variable with density $f$, then for any real numbers $a < b$ we have
$$P(X \in (a, b)) = P(X \in [a, b]) = \int_a^b f(t)\,dt.$$

Example 3. A point is chosen randomly from the unit disk so that each point in the disk is equally likely. Let $X$ be the distance of the point from the center of the disk. Then the distribution function of $X$ is given by
$$F(x) = \begin{cases} 0, & x \le 0, \\ x^2, & 0 < x \le 1, \\ 1, & x > 1. \end{cases}$$
This random variable is an absolutely continuous random variable with density
$$f(x) = \begin{cases} 2x, & x \in (0, 1), \\ 0, & \text{otherwise.} \end{cases}$$

Example 4. Suppose that $X$ is an absolutely continuous random variable whose density function is given by
$$f(x) = \begin{cases} C(4x - 2x^2), & 0 < x < 2, \\ 0, & \text{otherwise.} \end{cases}$$
(a) What is the value of $C$? (b) Find $P(X > 1)$.

Solution. (a) Since $f$ is a probability density, we must have
$$C \int_0^2 (4x - 2x^2)\,dx = 1.$$
Since $\int_0^2 (4x - 2x^2)\,dx = \frac{8}{3}$, we have $C = \frac{3}{8}$. (b)
$$P(X > 1) = \int_1^{\infty} f(x)\,dx = \frac{3}{8} \int_1^2 (4x - 2x^2)\,dx = \frac{1}{2}.$$

Definition 5. If $X$ is an absolutely continuous random variable with density $f$, then the expectation of $X$ is defined as
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx.$$

Example 6. Suppose that $X$ is an absolutely continuous random variable with density given by
$$f(x) = \begin{cases} 2x, & x \in (0, 1), \\ 0, & \text{otherwise.} \end{cases}$$
Then the expectation of $X$ is
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx = \int_0^1 2x^2\,dx = \frac{2}{3}.$$
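The arithmetic of Example 4 can be checked numerically; the sketch below (a plain midpoint rule, an arbitrary choice of quadrature) recovers $C = 3/8$ and $P(X > 1) = 1/2$:

```python
def integrate(f, a, b, n=200_000):
    # Midpoint-rule numerical integration.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

g = lambda x: 4 * x - 2 * x ** 2      # the unnormalized density on (0, 2)
c = integrate(g, 0.0, 2.0)            # should be 8/3, so C = 3/8
C = 1 / c
p_gt_1 = integrate(lambda x: C * g(x), 1.0, 2.0)   # P(X > 1), should be 1/2
```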

The following theorem tells us that if $X$ is an absolutely continuous random variable with density $f$, then we can use the density $f$ of $X$ to find the expectation of any function of $X$.

Theorem 7. If $X$ is an absolutely continuous random variable with density $f$, then for any real-valued function $\varphi$ on $\mathbb{R}$,
$$E[\varphi(X)] = \int_{-\infty}^{\infty} \varphi(x) f(x)\,dx.$$

Definition 8. If $X$ is an absolutely continuous random variable, then its variance is defined as
$$\mathrm{Var}(X) = E[(X - E[X])^2].$$
Just as in the discrete case, we have
$$\mathrm{Var}(X) = E[X^2] - (E[X])^2.$$

As a consequence of Theorem 7, we immediately get the following result, which we have seen in the discrete case.

Proposition 9. If $X$ is an absolutely continuous random variable and $a, b$ are real numbers, then
$$E[aX + b] = aE[X] + b, \qquad \mathrm{Var}(aX + b) = a^2 \mathrm{Var}(X).$$

Example 10. Find the variance of the random variable in Example 6.

Solution.
$$E[X^2] = \int_0^1 x^2 \cdot 2x\,dx = \frac{1}{2}.$$
Thus $\mathrm{Var}(X) = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{18}$.

Example 11. Suppose that $X$ is an absolutely continuous random variable with density given by
$$f(x) = \begin{cases} 1, & x \in (0, 1), \\ 0, & \text{otherwise.} \end{cases}$$
Find the expectation of $e^X$.

Solution.
$$E[e^X] = \int_0^1 e^x\,dx = e - 1.$$

Now we will introduce some important classes of absolutely continuous random variables. The first class is the uniform random variables. A random variable $X$ is said to be uniformly distributed over the interval $(a, b)$ if it is absolutely continuous with density given by
$$f(x) = \begin{cases} \frac{1}{b-a}, & x \in (a, b), \\ 0, & \text{otherwise.} \end{cases}$$
If $X$ is uniformly distributed over the interval $(a, b)$, then
$$E[X] = \frac{a+b}{2}, \qquad \mathrm{Var}(X) = \frac{(b-a)^2}{12}.$$
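The moment computations above can be verified numerically. The sketch below checks $E[X] = 2/3$ and $\mathrm{Var}(X) = 1/18$ for the density $2x$ of Examples 6 and 10, and the uniform formulas $E[X] = (a+b)/2$, $\mathrm{Var}(X) = (b-a)^2/12$ for the arbitrarily chosen interval $(2, 5)$:

```python
def integrate(f, a, b, n=200_000):
    # Midpoint-rule numerical integration.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 2 * x                                  # density from Example 6
mean = integrate(lambda x: x * f(x), 0.0, 1.0)       # E[X] = 2/3
second = integrate(lambda x: x * x * f(x), 0.0, 1.0) # E[X^2] = 1/2
var = second - mean ** 2                             # 1/2 - 4/9 = 1/18

# Uniform(a, b) moments, for the hypothetical choice (a, b) = (2, 5).
a, b = 2.0, 5.0
u = lambda x: 1 / (b - a)
u_mean = integrate(lambda x: x * u(x), a, b)                  # (a+b)/2 = 3.5
u_var = integrate(lambda x: (x - u_mean) ** 2 * u(x), a, b)   # 9/12 = 0.75
```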

Example 12. Buses arrive at a specified bus stop at 15-minute intervals starting at 7 am. That is, they arrive at 7, 7:15, 7:30, and so on. If a passenger arrives at the stop at a time that is uniformly distributed between 7 and 7:30, find the probability that he waits (a) less than 5 minutes for a bus; (b) more than 10 minutes for a bus.

Example 13. A stick of length 1 is split at a point $X$ that is uniformly distributed over $(0, 1)$. Suppose that $p$ is a point in $(0, 1)$. Determine the expected length of the piece that contains the point $p$.

Now we introduce the normal random variables. Consider the function
$$g(x) = e^{-x^2/2}, \quad x \in \mathbb{R}.$$
It is easy to see that the integral
$$c = \int_{-\infty}^{\infty} g(x)\,dx$$
converges, and thus $c$ is a finite positive number. There is no simple formula for the antiderivative of $g$, and so we cannot evaluate $c$ by the fundamental theorem of calculus. The easiest way to evaluate $c$ is by a very special trick in which we write $c^2$ as a two-dimensional integral and introduce polar coordinates:
$$c^2 = \int_{-\infty}^{\infty} e^{-x^2/2}\,dx \int_{-\infty}^{\infty} e^{-y^2/2}\,dy = \iint_{\mathbb{R}^2} e^{-(x^2+y^2)/2}\,dx\,dy = \int_0^{\infty} \left( \int_0^{2\pi} e^{-r^2/2}\, r\,d\theta \right) dr = 2\pi \int_0^{\infty} r e^{-r^2/2}\,dr = 2\pi.$$
Therefore $c = \sqrt{2\pi}$, and the function
$$\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}, \quad x \in \mathbb{R},$$
is a probability density; it is called the standard normal density. An absolutely continuous random variable whose density is the standard normal density is called a standard normal random variable. There is no simple formula for the distribution function of a standard normal random variable. It is traditional to denote the distribution function of a standard normal random variable by
$$\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2/2}\,dt, \quad x \in \mathbb{R}.$$
One can find the values of $\Phi(x)$ in tables given in probability textbooks or by computer. It is easy to check that $\Phi$ satisfies the relation
$$\Phi(-x) = 1 - \Phi(x), \quad x \in \mathbb{R}.$$
Suppose that $X$ is a standard normal random variable, $\mu$ is a real number, and $\sigma > 0$. Let us consider the random variable $Y = \mu + \sigma X$. For any $y \in \mathbb{R}$,
$$P(Y \le y) = P(\mu + \sigma X \le y) = P(X \le (y - \mu)/\sigma)$$
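The polar-coordinate result $c = \sqrt{2\pi}$ can be sanity-checked numerically; in the sketch below, truncating the integral to $(-10, 10)$ is an assumption justified by the negligible Gaussian tails, and the midpoint rule is an arbitrary quadrature choice:

```python
import math

def integrate(f, a, b, n=100_000):
    # Midpoint-rule numerical integration.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# c = integral of e^{-x^2/2} over R; the tails beyond |x| = 10 are
# far below the quadrature error, so truncation is harmless.
c = integrate(lambda x: math.exp(-x * x / 2), -10.0, 10.0)
# c should agree with sqrt(2*pi) ~ 2.5066
```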

$$= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{(y-\mu)/\sigma} e^{-t^2/2}\,dt.$$
By the chain rule and the fundamental theorem of calculus, we know that $Y$ is an absolutely continuous random variable whose density is given by
$$f(y) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(y-\mu)^2/(2\sigma^2)}, \quad y \in \mathbb{R}.$$
The probability density above is called a normal density with parameters $\mu$ and $\sigma^2$, and will be denoted by $n(\mu, \sigma^2)$ or $n(y; \mu, \sigma^2)$. An absolutely continuous random variable whose density is a normal density $n(\mu, \sigma^2)$ is called a normal random variable with parameters $\mu$ and $\sigma^2$. The standard normal density is simply the normal density with parameters $0$ and $1$, that is, $n(0, 1)$. Thus we have shown that if $X$ is a standard normal random variable, $\mu$ is a real number, and $\sigma > 0$, then the random variable $Y = \mu + \sigma X$ is a normal random variable with parameters $\mu$ and $\sigma^2$.

Similarly to the paragraph above, we can show that if $X$ is a normal random variable with parameters $\mu$ and $\sigma^2$, then $Z = (X - \mu)/\sigma$ is a standard normal random variable. This fact will be very useful in finding probabilities involving general normal random variables.

Theorem 14. If $X$ is a normal random variable with parameters $\mu$ and $\sigma^2$, then
$$E[X] = \mu, \qquad \mathrm{Var}(X) = \sigma^2.$$

Proof. Since $X$ can be written as $\mu + \sigma Z$, where $Z$ is a standard normal random variable, we only need to prove the theorem when $X$ is a standard normal random variable. So we assume that $X$ is a standard normal random variable. Since the function $x \mapsto \frac{x}{\sqrt{2\pi}} e^{-x^2/2}$ is an odd function, we get that
$$E[X] = \int_{-\infty}^{\infty} x\, \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\,dx = 0.$$
Using integration by parts (with $u = x$, $dv = x e^{-x^2/2}\,dx$), we get
$$\mathrm{Var}(X) = E[X^2] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} x^2 e^{-x^2/2}\,dx = \frac{1}{\sqrt{2\pi}} \left( \left. -x e^{-x^2/2} \right|_{-\infty}^{\infty} + \int_{-\infty}^{\infty} e^{-x^2/2}\,dx \right) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-x^2/2}\,dx = 1.$$

Example 15. If $X$ is a normal random variable with parameters $\mu = 3$ and $\sigma^2 = 9$, find (a) $P(2 < X < 5)$; (b) $P(X > 0)$; (c) $P(|X - 3| > 6)$.

Solution. (a)
$$P(2 < X < 5) = P\left( \frac{2-3}{3} < \frac{X-3}{3} < \frac{5-3}{3} \right) = P\left( -\frac{1}{3} < \frac{X-3}{3} < \frac{2}{3} \right) = \Phi\left(\tfrac{2}{3}\right) - \Phi\left(-\tfrac{1}{3}\right) \approx .3779.$$
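In place of a table, $\Phi$ can be evaluated through the error function, since $\Phi(x) = \frac{1}{2}\left(1 + \mathrm{erf}(x/\sqrt{2})\right)$. The sketch below checks the symmetry relation $\Phi(-x) = 1 - \Phi(x)$ and reproduces part (a) of Example 15 by standardizing with $\mu = 3$, $\sigma = 3$:

```python
import math

def Phi(x):
    # Standard normal cdf via the error function:
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Symmetry relation Phi(-x) = 1 - Phi(x), checked at an arbitrary point.
sym_gap = abs(Phi(-1.3) - (1.0 - Phi(1.3)))

# Example 15(a): X normal with mu = 3, sigma = 3; standardize both endpoints.
mu, sigma = 3.0, 3.0
p = Phi((5 - mu) / sigma) - Phi((2 - mu) / sigma)   # P(2 < X < 5), about .3779
```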

(b)
$$P(X > 0) = P\left( \frac{X-3}{3} > \frac{0-3}{3} \right) = P\left( \frac{X-3}{3} > -1 \right) = 1 - \Phi(-1) = \Phi(1) \approx .8413.$$
(c)
$$P(|X - 3| > 6) = P(X > 9) + P(X < -3) = P\left( \frac{X-3}{3} > 2 \right) + P\left( \frac{X-3}{3} < -2 \right) = 1 - \Phi(2) + \Phi(-2) = 2(1 - \Phi(2)) \approx .0456.$$

Example 16. When we say that the grades for a test are curved, we mean that, after the test, the instructor finds the mean $\mu$ and standard deviation $\sigma$ of the scores, and then assigns the letter grade A to those whose test score is greater than $\mu + \sigma$, B to those whose score is between $\mu$ and $\mu + \sigma$, C to those whose score is between $\mu - \sigma$ and $\mu$, D to those whose score is between $\mu - 2\sigma$ and $\mu - \sigma$, and F to those whose score is below $\mu - 2\sigma$. Assuming that the test score follows a normal distribution, find the probability of each letter grade.

Solution.
$$P(X > \mu + \sigma) = 1 - \Phi(1) \approx .1587$$
$$P(\mu < X \le \mu + \sigma) = \Phi(1) - \Phi(0) \approx .3413$$
$$P(\mu - \sigma < X \le \mu) = \Phi(0) - \Phi(-1) \approx .3413$$
$$P(\mu - 2\sigma < X \le \mu - \sigma) = \Phi(-1) - \Phi(-2) \approx .1359$$
$$P(X < \mu - 2\sigma) = \Phi(-2) \approx .0228.$$

Example 17. Suppose that $X$ is a normal random variable with parameters $\mu = 3$ and $\sigma^2 = 4$. Find the value of $c$ so that $P(|X - 3| > c) = .01$.

Solution.
$$P(|X - 3| > c) = P(X > 3 + c) + P(X < 3 - c) = P\left( \frac{X-3}{2} > \frac{c}{2} \right) + P\left( \frac{X-3}{2} < -\frac{c}{2} \right) = 2\left(1 - \Phi\left(\frac{c}{2}\right)\right).$$
In order that $P(|X - 3| > c) = .01$, we need $2(1 - \Phi(c/2)) = .01$, that is, $\Phi(c/2) = .995$. From the table we get $c/2 \approx 2.575$, and so $c \approx 5.15$.

Now we introduce the exponential random variables. For any $\lambda > 0$, the function
$$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0, \\ 0, & x \le 0, \end{cases}$$
is a probability density function, and it is called the exponential density with parameter $\lambda$. For any $\lambda > 0$, an absolutely continuous random variable $X$ is called an exponential random variable with parameter $\lambda$ if its density function is an exponential density with parameter $\lambda$. Suppose that $X$ is an exponential random variable with parameter $\lambda$; then for any $x > 0$ we have
$$P(X \le x) = \int_0^x \lambda e^{-\lambda t}\,dt = 1 - e^{-\lambda x}.$$
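Because the cutoffs in Example 16 standardize to $-2, -1, 0, 1$, the five letter-grade probabilities do not depend on $\mu$ or $\sigma$ at all. A sketch computing them with the erf-based $\Phi$:

```python
import math

def Phi(x):
    # Standard normal cdf via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Grade-curve probabilities from Example 16; cutoffs standardize to -2..1.
pA = 1 - Phi(1)        # about .1587
pB = Phi(1) - Phi(0)   # about .3413
pC = Phi(0) - Phi(-1)  # about .3413
pD = Phi(-1) - Phi(-2) # about .1359
pF = Phi(-2)           # about .0228
total = pA + pB + pC + pD + pF   # the five grades exhaust all scores
```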

So the distribution function of $X$ is given by
$$F(x) = \begin{cases} 1 - e^{-\lambda x}, & x \ge 0, \\ 0, & x < 0. \end{cases}$$

Theorem 18. Suppose that $X$ is an exponential random variable with parameter $\lambda$. Then
$$E[X] = \frac{1}{\lambda}, \qquad \mathrm{Var}(X) = \frac{1}{\lambda^2}.$$

Proof. For any positive integer $n$, using integration by parts, we have
$$E[X^n] = \int_0^{\infty} x^n \lambda e^{-\lambda x}\,dx = \left. -x^n e^{-\lambda x} \right|_0^{\infty} + \int_0^{\infty} n x^{n-1} e^{-\lambda x}\,dx = \frac{n}{\lambda} \int_0^{\infty} x^{n-1} \lambda e^{-\lambda x}\,dx = \frac{n}{\lambda} E[X^{n-1}].$$
Letting $n = 1$ and then $n = 2$, we get $E[X] = \frac{1}{\lambda}$ and $E[X^2] = \frac{2}{\lambda} E[X] = \frac{2}{\lambda^2}$. Thus $\mathrm{Var}(X) = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$.

Example 19. Suppose that the length of a phone call in minutes is an exponential random variable with parameter $\lambda = \frac{1}{10}$. If someone arrives immediately ahead of you at a public telephone booth, find the probability that you have to wait (a) more than 10 minutes; (b) between 10 and 20 minutes.

We say that a positive random variable $X$ is memoryless if
$$P(X > s + t \mid X > t) = P(X > s), \quad \text{for all } s, t > 0.$$
It is very easy to check that any exponential random variable is memoryless. In fact, one can prove that if a positive random variable $X$ is memoryless in the sense above, it must be an exponential random variable. This suggests that exponential random variables are very useful in modeling lifetimes of certain equipment when the memoryless property is approximately satisfied.

To introduce the next type of random variable, we recall the definition of the Gamma function. For any $\alpha > 0$, the Gamma function is defined by
$$\Gamma(\alpha) = \int_0^{\infty} x^{\alpha-1} e^{-x}\,dx.$$
There is no explicit formula for this function. One can easily see that $\Gamma(1) = 1$. In fact, we can easily find a formula for $\Gamma(n)$ for any positive integer $n$. Before this, we observe the following.

Proposition 20. For any $\alpha > 0$, we have $\Gamma(\alpha + 1) = \alpha \Gamma(\alpha)$.

Proof.
$$\Gamma(\alpha + 1) = \int_0^{\infty} x^{\alpha} e^{-x}\,dx = \left. -x^{\alpha} e^{-x} \right|_0^{\infty} + \int_0^{\infty} \alpha x^{\alpha-1} e^{-x}\,dx = \alpha \Gamma(\alpha).$$
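The survival function $P(X > x) = e^{-\lambda x}$ makes both the memoryless property and Example 19 one-line computations. The sketch below uses $\lambda = 1/10$ as in Example 19 and arbitrary positive values of $s$ and $t$ for the memoryless check (the resulting answers to (a) and (b) are straightforward consequences, not stated in the notes):

```python
import math

lam = 1 / 10                        # lambda = 1/10, as in Example 19
S = lambda x: math.exp(-lam * x)    # survival function P(X > x) = e^{-lam x}

# Memoryless check: P(X > s + t | X > t) = S(s + t)/S(t) should equal S(s).
s, t = 7.0, 12.0                    # arbitrary positive values
gap = abs(S(s + t) / S(t) - S(s))

# Example 19 then follows from the survival function:
p_more_10 = S(10)                   # (a) P(X > 10) = e^{-1}
p_10_to_20 = S(10) - S(20)          # (b) P(10 < X < 20) = e^{-1} - e^{-2}
```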

Using this proposition, we immediately get $\Gamma(n) = (n-1)!$ for any positive integer $n$. One can easily check that, for any $\alpha > 0$ and $\lambda > 0$, the function
$$f(x) = \begin{cases} \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, & x > 0, \\ 0, & x \le 0, \end{cases}$$
is a probability density function, and it is called a Gamma density with parameters $\alpha$ and $\lambda$. For any $\alpha > 0$ and $\lambda > 0$, an absolutely continuous random variable is called a Gamma random variable with parameters $\alpha$ and $\lambda$ if its density is a Gamma density with parameters $\alpha$ and $\lambda$. A Gamma random variable with parameters $\alpha = 1$ and $\lambda$ is simply an exponential random variable with parameter $\lambda$.

Theorem 21. Suppose that $X$ is a Gamma random variable with parameters $\alpha$ and $\lambda$. Then
$$E[X] = \frac{\alpha}{\lambda}, \qquad \mathrm{Var}(X) = \frac{\alpha}{\lambda^2}.$$

Proof.
$$E[X] = \int_0^{\infty} x\, \frac{\lambda e^{-\lambda x} (\lambda x)^{\alpha-1}}{\Gamma(\alpha)}\,dx = \frac{1}{\lambda \Gamma(\alpha)} \int_0^{\infty} \lambda e^{-\lambda x} (\lambda x)^{\alpha}\,dx = \frac{\Gamma(\alpha + 1)}{\lambda \Gamma(\alpha)} = \frac{\alpha}{\lambda}.$$
Similarly, we can find $E[X^2]$ and then find that $\mathrm{Var}(X) = \frac{\alpha}{\lambda^2}$.

Example 22. Suppose that $X$ is a normal random variable with parameters $0$ and $\sigma^2$. Find the density of $Y = X^2$.

Solution. Let $F$ be the distribution function of $X$ and $f$ the density function of $X$. $Y$ is a nonnegative random variable. For any $y > 0$, we have
$$P(Y \le y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F(\sqrt{y}) - F(-\sqrt{y}).$$
Thus the density of $Y$ is
$$g(y) = \begin{cases} \frac{1}{2\sqrt{y}} \left( f(\sqrt{y}) + f(-\sqrt{y}) \right), & y > 0, \\ 0, & y \le 0. \end{cases}$$
Plugging in the expression for $f$, we get that the density of $Y$ is
$$g(y) = \begin{cases} \frac{1}{\sqrt{2\pi}\,\sigma \sqrt{y}}\, e^{-y/(2\sigma^2)}, & y > 0, \\ 0, & y \le 0. \end{cases} \tag{0.1}$$
This example shows that, if $X$ is a normal random variable with parameters $0$ and $\sigma^2$, then $X^2$ is a Gamma random variable with parameters $\alpha = \frac{1}{2}$ and $\lambda = \frac{1}{2\sigma^2}$. By comparing (0.1) with the definition of the Gamma density, we see that $\Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}$.
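Two quick numerical checks of the facts above: $\Gamma(1/2) = \sqrt{\pi}$ (using the standard library's `math.gamma`), and $E[X] = \alpha/\lambda$ for a Gamma density (the parameter values and quadrature below are arbitrary choices for the sketch, not from the notes):

```python
import math

# Gamma(1/2) = sqrt(pi), as derived from the density of Y = X^2.
gamma_half_gap = abs(math.gamma(0.5) - math.sqrt(math.pi))

def integrate(f, a, b, n=400_000):
    # Midpoint-rule numerical integration.
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Mean of a Gamma(alpha, lam) density computed numerically; Theorem 21
# predicts alpha/lam. Parameter values here are hypothetical.
alpha, lam = 2.0, 0.5
dens = lambda x: lam * math.exp(-lam * x) * (lam * x) ** (alpha - 1) / math.gamma(alpha)
mean = integrate(lambda x: x * dens(x), 0.0, 80.0)   # expect alpha/lam = 4
```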

From this and Proposition 20 we immediately get that, for any positive odd integer $n$,
$$\Gamma\left(\frac{n}{2}\right) = \frac{\sqrt{\pi}\,(n-1)!}{2^{n-1} \left(\frac{n-1}{2}\right)!}.$$

Now we deal with the following problem: suppose that $X$ is an absolutely continuous random variable with density $f$ and $\varphi$ is a function on $\mathbb{R}$; how do we find the density of the random variable $Y = \varphi(X)$? Our strategy is to first find the distribution function of $Y$ in terms of $f$ and then differentiate to find the density of $Y$. We illustrate this with a few examples.

Example 23. Let $X$ be uniformly distributed on $(0, 1)$ and $\lambda > 0$ a constant. Find the density of $Y = -\lambda^{-1} \ln(1 - X)$.

Solution. Obviously $Y$ is a nonnegative random variable. For any $y > 0$,
$$P(Y \le y) = P(-\lambda^{-1} \ln(1 - X) \le y) = P(\ln(1 - X) \ge -\lambda y) = P(1 - X \ge e^{-\lambda y}) = P(X \le 1 - e^{-\lambda y}) = 1 - e^{-\lambda y}.$$
Hence the density of $Y$ is given by
$$g(y) = \begin{cases} \lambda e^{-\lambda y}, & y > 0, \\ 0, & y \le 0. \end{cases}$$

Example 24. Let $X$ be an exponential random variable with parameter $\lambda$ and $\beta \ne 0$ a constant. Find the density of $Y = X^{1/\beta}$.

Solution. We deal with this example in two separate cases: $\beta > 0$ and $\beta < 0$. The two cases are similar, and we only go over the case $\beta < 0$. $Y$ is obviously a nonnegative random variable. For any $y > 0$,
$$P(Y \le y) = P(X^{1/\beta} \le y) = P(X \ge y^{\beta}) = e^{-\lambda y^{\beta}}.$$
Hence the density of $Y$ is given by
$$g(y) = \begin{cases} -\beta \lambda y^{\beta-1} e^{-\lambda y^{\beta}}, & y > 0, \\ 0, & y \le 0. \end{cases}$$
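Example 23 says that $-\lambda^{-1}\ln(1-X)$ with $X$ uniform on $(0,1)$ is exponential with parameter $\lambda$; this is the inverse-transform method for sampling the exponential distribution. A Monte Carlo sketch (the rate, seed, and sample size below are arbitrary choices) checks that the sample mean is near $1/\lambda$ and that an empirical tail probability is near $e^{-\lambda}$:

```python
import math
import random

random.seed(0)
lam = 2.0          # arbitrary rate for the sketch
n = 200_000

# Example 23 in practice: Y = -ln(1 - U)/lam with U uniform on (0, 1)
# is exponential with parameter lam (inverse-transform sampling).
samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

sample_mean = sum(samples) / n                  # expect E[Y] = 1/lam = 0.5
p_emp = sum(1 for y in samples if y > 1.0) / n  # expect P(Y > 1) = e^{-lam}
```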


Some Continuous Probability Distributions CHAPTER 6: Continuous Uniform Distribution: 6. Definition: The density function of the continuous random variable X on the interval [A, B] is B A A x B f(x; A,

### Notes on the second moment method, Erdős multiplication tables

Notes on the second moment method, Erdős multiplication tables January 25, 20 Erdős multiplication table theorem Suppose we form the N N multiplication table, containing all the N 2 products ab, where

### MATH 461: Fourier Series and Boundary Value Problems

MATH 461: Fourier Series and Boundary Value Problems Chapter III: Fourier Series Greg Fasshauer Department of Applied Mathematics Illinois Institute of Technology Fall 2015 fasshauer@iit.edu MATH 461 Chapter

### Random Variables, Expectation, Distributions

Random Variables, Expectation, Distributions CS 5960/6960: Nonparametric Methods Tom Fletcher January 21, 2009 Review Random Variables Definition A random variable is a function defined on a probability

### Example: 1. You have observed that the number of hits to your web site follow a Poisson distribution at a rate of 2 per day.

16 The Exponential Distribution Example: 1. You have observed that the number of hits to your web site follow a Poisson distribution at a rate of 2 per day. Let T be the time (in days) between hits. 2.

### 1 Basic concepts from probability theory

Basic concepts from probability theory This chapter is devoted to some basic concepts from probability theory.. Random variable Random variables are denoted by capitals, X, Y, etc. The expected value or

### Finite Markov Chains and Algorithmic Applications. Matematisk statistik, Chalmers tekniska högskola och Göteborgs universitet

Finite Markov Chains and Algorithmic Applications Olle Häggström Matematisk statistik, Chalmers tekniska högskola och Göteborgs universitet PUBLISHED BY THE PRESS SYNDICATE OF THE UNIVERSITY OF CAMBRIDGE

### N E W S A N D L E T T E R S

N E W S A N D L E T T E R S 73rd Annual William Lowell Putnam Mathematical Competition Editor s Note: Additional solutions will be printed in the Monthly later in the year. PROBLEMS A1. Let d 1, d,...,

### Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab

Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh Overview of Monte Carlo Simulation, Probability Review and Introduction to Matlab 1 Overview of Monte Carlo Simulation 1.1 Why use simulation?

### Poisson Processes. Chapter 5. 5.1 Exponential Distribution. The gamma function is defined by. Γ(α) = t α 1 e t dt, α > 0.

Chapter 5 Poisson Processes 5.1 Exponential Distribution The gamma function is defined by Γ(α) = t α 1 e t dt, α >. Theorem 5.1. The gamma function satisfies the following properties: (a) For each α >

### Introduction to Monte-Carlo Methods

Introduction to Monte-Carlo Methods Bernard Lapeyre Halmstad January 2007 Monte-Carlo methods are extensively used in financial institutions to compute European options prices to evaluate sensitivities

### Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0

Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0 This primer provides an overview of basic concepts and definitions in probability and statistics. We shall

### Examination 110 Probability and Statistics Examination

Examination 0 Probability and Statistics Examination Sample Examination Questions The Probability and Statistics Examination consists of 5 multiple-choice test questions. The test is a three-hour examination

### Statistics - Written Examination MEC Students - BOVISA

Statistics - Written Examination MEC Students - BOVISA Prof.ssa A. Guglielmi 26.0.2 All rights reserved. Legal action will be taken against infringement. Reproduction is prohibited without prior consent.

### Important Probability Distributions OPRE 6301

Important Probability Distributions OPRE 6301 Important Distributions... Certain probability distributions occur with such regularity in real-life applications that they have been given their own names.

### Normal approximation to the Binomial

Chapter 5 Normal approximation to the Binomial 5.1 History In 1733, Abraham de Moivre presented an approximation to the Binomial distribution. He later (de Moivre, 1756, page 242 appended the derivation

### Probability is concerned with quantifying the likelihoods of various events in situations involving elements of randomness or uncertainty.

Chapter 1 Probability Spaces 11 What is Probability? Probability is concerned with quantifying the likelihoods of various events in situations involving elements of randomness or uncertainty Example 111

### Definition The covariance of X and Y, denoted by cov(x, Y ) is defined by. cov(x, Y ) = E(X µ 1 )(Y µ 2 ).

Correlation Regression Bivariate Normal Suppose that X and Y are r.v. s with joint density f(x y) and suppose that the means of X and Y are respectively µ 1 µ 2 and the variances are 1 2. Definition The

### Sequences of Functions

Sequences of Functions Uniform convergence 9. Assume that f n f uniformly on S and that each f n is bounded on S. Prove that {f n } is uniformly bounded on S. Proof: Since f n f uniformly on S, then given

### Bivariate Distributions

Chapter 4 Bivariate Distributions 4.1 Distributions of Two Random Variables In many practical cases it is desirable to take more than one measurement of a random observation: (brief examples) 1. What is

### Prof. Tesler. Math 186 and 283 Winter Prof. Tesler Poisson & Exponential Distributions Math 186 / Winter / 31

Math 186: 4.2 Poisson Distribution: Counting Crossovers in Meiosis 4.2 Exponential and 4.6 Gamma Distributions: Distance Between Crossovers Math 283: Ewens & Grant 1.3.7, 4.1-4.2 Prof. Tesler Math 186

### 6.4 The Infinitely Many Alleles Model

6.4. THE INFINITELY MANY ALLELES MODEL 93 NE µ [F (X N 1 ) F (µ)] = + 1 i

### Notes for Math 450 Lecture Notes 3

Notes for Math 45 Lecture Notes 3 Renato Feres 1 Moments of Random Variables We introduce some of the standard parameters associated to a random variable. The two most common are the expected value and

### Lecture 3: Fourier Series: pointwise and uniform convergence.

Lecture 3: Fourier Series: pointwise and uniform convergence. 1. Introduction. At the end of the second lecture we saw that we had for each function f L ([, π]) a Fourier series f a + (a k cos kx + b k

### Probability and Statistics

CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b - 0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute - Systems and Modeling GIGA - Bioinformatics ULg kristel.vansteen@ulg.ac.be