Continuous Random Variables

The probability that a continuous random variable, X, has a value between a and b is computed by integrating its probability density function (p.d.f.) over the interval [a, b]:

P(a \le X \le b) = \int_a^b f_X(x)\,dx.

A p.d.f. must integrate to one:

\int_{-\infty}^{\infty} f_X(x)\,dx = 1.
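Both properties can be checked numerically. A minimal sketch in Python (the exponential p.d.f. with \tau = 2 and the midpoint rule are our choices, purely for illustration):

```python
import math

def riemann(f, a, b, n=100_000):
    """Approximate the integral of f over [a, b] with a midpoint sum."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

tau = 2.0
pdf = lambda x: math.exp(-x / tau) / tau  # exponential p.d.f.

# P(1 <= X <= 3) by direct integration of the p.d.f.
p = riemann(pdf, 1.0, 3.0)
print(round(p, 4))

# The p.d.f. integrates to (approximately) one over its support;
# the upper limit is truncated at 100, where the tail is negligible.
total = riemann(pdf, 0.0, 100.0)
print(round(total, 4))
```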

Continuous Random Variables (contd.)

The probability that the continuous random variable, X, takes any exact value, a, is zero:

P(X = a) = \lim_{\Delta x \to 0} P(a \le X \le a + \Delta x) = \lim_{\Delta x \to 0} \int_a^{a + \Delta x} f_X(x)\,dx = 0.

In general, P(X = a) \ne f_X(a).

Probability Density

The probability density at a, multiplied by \varepsilon, approximately equals the probability mass contained within an interval of width \varepsilon centered on a:

\varepsilon f_X(a) \approx \int_{a - \varepsilon/2}^{a + \varepsilon/2} f_X(x)\,dx = P(a - \varepsilon/2 \le X \le a + \varepsilon/2).

Cumulative Distribution Function

A continuous random variable, X, can also be defined by its cumulative distribution function (c.d.f.):

F_X(a) = P(X \le a) = \int_{-\infty}^a f_X(x)\,dx.

For any c.d.f., F_X(-\infty) = 0 and F_X(\infty) = 1. The probability that a continuous random variable, X, has a value between a and b is easily computed using the c.d.f.:

P(a \le X \le b) = \int_a^b f_X(x)\,dx = \int_{-\infty}^b f_X(x)\,dx - \int_{-\infty}^a f_X(x)\,dx = F_X(b) - F_X(a).
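When the c.d.f. has a closed form, this identity avoids numerical integration entirely. A small sketch (the exponential c.d.f. F_X(a) = 1 - e^{-a/\tau} with \tau = 2 is our choice for the example):

```python
import math

tau = 2.0

def cdf(a):
    """Exponential c.d.f.: F_X(a) = 1 - exp(-a/tau) for a >= 0, else 0."""
    return 1.0 - math.exp(-a / tau) if a > 0 else 0.0

# P(1 <= X <= 3) = F(3) - F(1)
p = cdf(3.0) - cdf(1.0)
print(round(p, 4))
```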

Cumulative Distribution Function (contd.)

The p.d.f., f_X(x), can be derived from the c.d.f., F_X(x):

f_X(x) = \frac{d}{dx} \int_{-\infty}^x f_X(s)\,ds = \frac{dF_X(x)}{dx}.

Joint Probability Densities

Let X and Y be continuous random variables. The probability that a \le X \le b and c \le Y \le d is found by integrating the joint probability density function for X and Y over the interval [a, b] w.r.t. x and over the interval [c, d] w.r.t. y:

P(a \le X \le b,\, c \le Y \le d) = \int_a^b \int_c^d f_{XY}(x,y)\,dy\,dx.

Like a one-dimensional p.d.f., a two-dimensional joint p.d.f. must also integrate to one:

\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1.

Marginal Probability Densities

f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy

f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx

Conditional Probability Densities

f_{X|Y}(x|y) = \frac{f_{XY}(x,y)}{f_Y(y)} = \frac{f_{XY}(x,y)}{\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx}

f_{XY}(x,y) = f_{X|Y}(x|y)\, f_Y(y)
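These relations can be verified numerically for a concrete joint density. A sketch (the density f_XY(x, y) = x + y on the unit square is our choice, purely for illustration; it integrates to one over [0, 1] x [0, 1]):

```python
def marginal(f, t, axis, n=10_000):
    """Marginal density at t, obtained by integrating the joint density
    over the other variable (midpoint rule on [0, 1])."""
    h = 1.0 / n
    if axis == "x":   # f_X(t) = integral of f(t, y) dy
        return sum(f(t, (j + 0.5) * h) for j in range(n)) * h
    else:             # f_Y(t) = integral of f(x, t) dx
        return sum(f((j + 0.5) * h, t) for j in range(n)) * h

# Example joint density on the unit square (an assumption for illustration):
f_xy = lambda x, y: x + y

fx = marginal(f_xy, 0.25, "x")   # exact marginal: f_X(x) = x + 1/2 = 0.75
fy = marginal(f_xy, 0.5, "y")    # exact marginal: f_Y(y) = y + 1/2 = 1.0

# Conditional density f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y):
f_cond = f_xy(0.25, 0.5) / fy    # exact value: 0.75 / 1.0 = 0.75
print(round(fx, 3), round(fy, 3), round(f_cond, 3))
```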

Exponential Density

A constant fraction of a radioactive sample decays per unit time:

\frac{d}{dt} f(t) = -\frac{1}{\tau} f(t).

What fraction of the radioactive sample will remain after time t? Note that

\frac{d}{dt}\left(e^{-t/\tau}\right) = -\frac{1}{\tau} e^{-t/\tau}.

Exponential Density (contd.)

The function f(t) = e^{-t/\tau} satisfies the differential equation, but it does not integrate to one:

\int_0^{\infty} e^{-t/\tau}\,dt = \left[-\tau e^{-t/\tau}\right]_0^{\infty} = 0 + \tau = \tau.

So that \int f_T(t)\,dt = 1, we divide f(t) by \tau:

f_T(t) = \frac{1}{\tau} e^{-t/\tau}.

Exponential Density (contd.)

The time, T, at which an atom of a radioactive element decays is a continuous random variable with the following p.d.f.:

f_T(t) = \frac{1}{\tau} e^{-t/\tau}.

The corresponding c.d.f. is:

F_T(a) = \int_0^a \frac{1}{\tau} e^{-t/\tau}\,dt = \left[-e^{-t/\tau}\right]_0^a = 1 - e^{-a/\tau}.

The c.d.f. gives the probability that an atom of a radioactive element has already decayed by time a.

Example

The lifetime of a radioactive element is a continuous random variable with the following p.d.f.:

f_T(t) = \frac{1}{100} e^{-t/100}.

The probability that an atom of this element will decay within 50 years is:

P(0 \le T \le 50) = \int_0^{50} \frac{1}{100} e^{-t/100}\,dt = 1 - e^{-0.5} \approx 0.39.
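The same computation in code, using the closed-form c.d.f. from the previous slide:

```python
import math

tau = 100.0  # mean lifetime, in years

# P(0 <= T <= 50) = F_T(50) = 1 - exp(-50/100)
p = 1.0 - math.exp(-50.0 / tau)
print(round(p, 2))  # 0.39
```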

Exponential Density (contd.)

The half-life, \lambda, is defined as the time required for half of a radioactive sample to decay:

P(0 \le T \le \lambda) = 1/2.

Since

P(0 \le T \le \lambda) = \int_0^{\lambda} \frac{1}{100} e^{-t/100}\,dt = 1 - e^{-\lambda/100} = 1/2,

it follows that \lambda = 100 \ln 2 \approx 69.31 years.
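Solving 1 - e^{-\lambda/\tau} = 1/2 for \lambda gives \lambda = \tau \ln 2, which a one-liner confirms:

```python
import math

tau = 100.0
half_life = tau * math.log(2)  # solve 1 - exp(-lam/tau) = 1/2 for lam
print(round(half_life, 2))     # 69.31

# Sanity check: half the sample has decayed by the half-life.
print(round(1.0 - math.exp(-half_life / tau), 6))
```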

Figure 1: Exponential p.d.f., \frac{1}{10} e^{-t/10}, and c.d.f., 1 - e^{-t/10}.

Memoryless Property of the Exponential

If X is an exponentially distributed random variable, then

P(X > s + t \mid X > t) = P(X > s).

Proof:

P(X > s + t \mid X > t) = \frac{P(X > s + t,\, X > t)}{P(X > t)} = \frac{P(X > t \mid X > s + t)\, P(X > s + t)}{P(X > t)} = \frac{P(X > s + t)}{P(X > t)}.

Since P(X > t) = 1 - P(X \le t),

\frac{P(X > s + t)}{P(X > t)} = \frac{1 - (1 - e^{-(s+t)/\tau})}{1 - (1 - e^{-t/\tau})} = e^{-s/\tau} = P(X > s).
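The property can also be checked empirically by sampling. A sketch (\tau = 10, s = 5, t = 8 and the sample size are arbitrary choices for the demonstration):

```python
import random

random.seed(0)
tau = 10.0
samples = [random.expovariate(1.0 / tau) for _ in range(200_000)]

s, t = 5.0, 8.0
beyond_t = [x for x in samples if x > t]

# Empirical P(X > s + t | X > t) versus empirical P(X > s):
lhs = sum(x > s + t for x in beyond_t) / len(beyond_t)
rhs = sum(x > s for x in samples) / len(samples)
print(round(lhs, 3), round(rhs, 3))  # both near exp(-0.5) = 0.607
```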

Memoryless Property of the Exponential

In plain language: knowing how long we've already waited doesn't tell us anything about how much longer we are going to have to wait, e.g., for a bus.

Expected Value

Let X be a continuous random variable. The expected value of X is defined as follows:

\langle X \rangle = \mu = \int_{-\infty}^{\infty} x f_X(x)\,dx.

Variance

The variance of X is defined as the expected value of the squared difference of X and \langle X \rangle:

\langle [X - \langle X \rangle]^2 \rangle = \sigma^2 = \int_{-\infty}^{\infty} [x - \langle X \rangle]^2 f_X(x)\,dx.
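Both definitions can be evaluated numerically. A sketch for the exponential density with \tau = 2 (our choice; its exact mean is \tau and exact variance \tau^2, which the integrals should recover):

```python
import math

tau = 2.0
pdf = lambda x: math.exp(-x / tau) / tau

def integrate(g, a, b, n=200_000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# <X> = integral of x f_X(x) dx; the tail beyond 200 is negligible.
mu = integrate(lambda x: x * pdf(x), 0.0, 200.0)

# sigma^2 = integral of (x - <X>)^2 f_X(x) dx.
var = integrate(lambda x: (x - mu) ** 2 * pdf(x), 0.0, 200.0)

print(round(mu, 3), round(var, 3))
```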

Gaussian Density

A random variable X with p.d.f.

f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x-\mu)^2/(2\sigma^2)}

is called a Gaussian (or normal) random variable with expected value \mu and variance \sigma^2.

Expected Value for Gaussian Density

Let the p.d.f., f_X(x), equal \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x-\mu)^2/(2\sigma^2)}. The expected value, \langle X \rangle, can be derived as follows:

\langle X \rangle = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} x\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx.

Expected Value for Gaussian Density (contd.)

Writing x as (x - \mu) + \mu:

\langle X \rangle = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} (x - \mu)\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx + \mu\, \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{-(x-\mu)^2/(2\sigma^2)}\,dx.

The first term is zero, since (after substitution of u for x - \mu) it is the integral of the product of an odd function and an even function, which is itself odd. The second term is \mu, since

\frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{-(x-\mu)^2/(2\sigma^2)}\,dx = \int_{-\infty}^{\infty} f_X(x)\,dx = 1.

Consequently, \langle X \rangle = \mu.

C.d.f. for Gaussian Density

Because the Gaussian integrates to one and (for \mu = 0) is symmetric about zero, its c.d.f., F_X(a), can be written as follows:

\int_{-\infty}^a f_X(x)\,dx = \frac{1}{2} - \int_a^0 f_X(x)\,dx \quad \text{if } a < 0, \qquad \frac{1}{2} + \int_0^a f_X(x)\,dx \quad \text{otherwise.}

Equivalently, since \int_0^a f_X(x)\,dx = -\int_a^0 f_X(x)\,dx, we can write for all a:

\int_{-\infty}^a f_X(x)\,dx = \frac{1}{2} + \int_0^a f_X(x)\,dx.

C.d.f. for Gaussian Density (contd.)

To evaluate \int_0^a f_X(x)\,dx, recall the Taylor series for e^x:

e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}.

The Taylor series for a (standard) Gaussian is therefore:

f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} = \frac{1}{\sqrt{2\pi}} \sum_{n=0}^{\infty} \frac{(-x^2/2)^n}{n!} = \frac{1}{\sqrt{2\pi}} \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{n!\, 2^n}.

C.d.f. for Gaussian Density (contd.)

Consequently:

\int_0^a f_X(x)\,dx = \frac{1}{\sqrt{2\pi}} \int_0^a \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{n!\, 2^n}\,dx = \frac{1}{\sqrt{2\pi}} \sum_{n=0}^{\infty} \frac{(-1)^n}{n!\, 2^n} \left[\frac{x^{2n+1}}{2n+1}\right]_0^a = \frac{1}{\sqrt{2\pi}} \sum_{n=0}^{\infty} \frac{(-1)^n a^{2n+1}}{n!\, 2^n (2n+1)}.
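The truncated series is easy to evaluate and can be checked against the closed form via the error function, since \int_0^a f_X(x)\,dx = \mathrm{erf}(a/\sqrt{2})/2 for the standard Gaussian. A minimal sketch (the truncation at 40 terms is our choice):

```python
import math

def phi_series(a, terms=40):
    """Taylor-series integral of the standard Gaussian p.d.f. from 0 to a:
    (1/sqrt(2*pi)) * sum_n (-1)^n a^(2n+1) / (n! 2^n (2n+1))."""
    total = 0.0
    for n in range(terms):
        total += (-1) ** n * a ** (2 * n + 1) / (
            math.factorial(n) * 2 ** n * (2 * n + 1))
    return total / math.sqrt(2 * math.pi)

a = 1.0
print(round(phi_series(a), 6))
print(round(math.erf(a / math.sqrt(2)) / 2, 6))  # closed form, same value
```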

Figure 2: Gaussian p.d.f. and c.d.f., \mu = 0 and \sigma^2 = 1, computed using the Taylor series.