Lecture 6: Discrete & Continuous Probability and Random Variables


Lecture 6: Discrete & Continuous Probability and Random Variables. D. Alex Hughes, Math Camp, September 17, 2015.

1. Finishing Basics of Expectation and Variance: Variance of Discrete Random Variables
2. Continuous Probability: Introduction; Continuous Probability Densities; Cumulative Density Function; Expectation of Continuous Random Variables; Variance of Continuous Random Variables
3. Properties of Expectation and Variance: Properties of Expectation; Properties of Variance

Variance of Discrete Random Variables. Definition: The variance of a random variable is the expected value of its squared deviations from µ. If X is discrete with pdf p_X(k),

VAR[X] = σ² = E[(X - µ)²] = Σ_k (k - µ)² p_X(k).

Note that µ = E[X]. Also note that the standard deviation σ is the square root of the variance. Equivalently, VAR[X] = E[(X - E[X])²].

Just as the expected value has a clear analogy to the center of balance of a distribution, the variance is analogous to the moment of inertia. Imagine placing two distributions on a turntable and pushing on each with identical force from the center. A distribution with low variance would spin faster, and a distribution with high variance would spin slower.

Discrete RV Variance. Example: What is the variance of a six-sided die roll?

Answer: First calculate the expected value of a single die roll:

E[X] = Σ_{i=1}^{6} x_i p(x_i) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 3.5 = µ

Discrete RV Variance. Answer: Then the variance is:

VAR[X] = Σ_{i=1}^{6} (x_i - µ)² p(x_i)
= (1 - 3.5)²(1/6) + (2 - 3.5)²(1/6) + (3 - 3.5)²(1/6) + (4 - 3.5)²(1/6) + (5 - 3.5)²(1/6) + (6 - 3.5)²(1/6)
= 17.5(1/6) ≈ 2.92
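The arithmetic above is easy to verify with a few lines of Python (a side check, not part of the original slides); exact fractions keep the answer as 35/12 rather than the rounded 2.92.

```python
# Variance of a fair six-sided die, straight from the definition
# VAR[X] = sum over k of (k - mu)^2 * p(k). Exact fractions avoid
# floating-point rounding.
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)                               # each face equally likely

mu = sum(x * p for x in outcomes)                # E[X] = 7/2 = 3.5
var = sum((x - mu) ** 2 * p for x in outcomes)   # E[(X - mu)^2]

print(mu, var)  # 7/2 35/12
```

35/12 ≈ 2.9167, matching the 2.92 on the slide.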

Example of Discrete RV Variance. Example: An urn contains five chips, two red and three white. Two chips are drawn at random, with replacement. Let X denote the number of red chips drawn. What is VAR[X]?

Answer: X is binomial with n = 2 and p = 2/5, so E[X] = np = (2)(2/5) = 4/5 = 0.8.

VAR[X] = Σ_{i=0}^{2} (i - E[X])² f(i) = Σ_{i=0}^{2} (i - 0.8)² (2 choose i) (2/5)^i (3/5)^{2-i}
= (0 - 0.8)² (2 choose 0) (2/5)⁰(3/5)² + (1 - 0.8)² (2 choose 1) (2/5)¹(3/5)¹ + (2 - 0.8)² (2 choose 2) (2/5)²(3/5)⁰
= (16/25)(1)(9/25) + (1/25)(2)(6/25) + (36/25)(1)(4/25)
= 12/25
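The same answer falls out of a short enumeration of the binomial pmf (a side check, not in the original slides); it also confirms the closed-form shortcut npq.

```python
# X ~ Binomial(n = 2, p = 2/5): enumerate the pmf, compute the variance
# from the definition, and cross-check against the shortcut npq.
from fractions import Fraction
from math import comb

n, p = 2, Fraction(2, 5)
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mu = sum(k * pk for k, pk in pmf.items())               # E[X] = 4/5
var = sum((k - mu) ** 2 * pk for k, pk in pmf.items())  # definition
assert var == n * p * (1 - p)                           # npq agrees
print(var)  # 12/25
```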

Example: Some process has an equal chance of generating any number in the interval [0,1]. What is the probability that the process generates a number less than 0.5?

Draw a number line. [Figure: the uniform density on [0,1], outcome value on the horizontal axis and probability on the vertical; the region below 0.5 covers half the total area, so the answer is 0.5.]

Continuous Probability. Definition: A probability function P on a set of real numbers S is called continuous if there exists a function f(t) such that for any closed interval [a, b] ⊂ S,

P([a, b]) = ∫_a^b f(t) dt

Comment: f(t) must satisfy two properties:
1. f(t) ≥ 0 for all t;
2. ∫_{-∞}^{∞} f(t) dt = 1.

Frequently, we refer to the probability density function as the pdf.

Continuous Probability Example. Example:
- What continuous probability function describes the uniform chance of drawing a number on the range [0,10]?
- What is the probability that a randomly selected number falls between 4 and 7?
- What is the probability that a randomly selected number falls in [2,4] or [6,9]?
- What continuous probability function describes the uniform chance of drawing a number on the range [a,b]?
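For the uniform density the integrals reduce to interval lengths divided by (b - a); a small helper (illustrative only, the function name is mine) answers the middle two questions:

```python
# P(lo <= X <= hi) for X ~ Uniform(a, b): the density is the constant
# 1/(b - a), so probability is just (clipped interval length) / (b - a).
def uniform_prob(lo, hi, a=0.0, b=10.0):
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0.0) / (b - a)

p_4_to_7 = uniform_prob(4, 7)                       # (7 - 4)/10 = 0.3
p_union = uniform_prob(2, 4) + uniform_prob(6, 9)   # disjoint, so add
print(p_4_to_7, p_union)
```

Disjoint intervals simply add: 2/10 + 3/10 = 5/10.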

Continuous Probability Example 2. Example: Is f(x) = 3x², 0 ≤ x ≤ 1, a valid probability function? What is the probability that 0 ≤ x ≤ 1/3? That 2/3 ≤ x ≤ 1?

Answer: Yes. Recall the two properties f(x) must satisfy: (A) f(x) is always non-negative; (B) the integral of f(x) across the function's domain equals 1. Both hold here, since 3x² ≥ 0 and ∫_0^1 3x² dx = x³ |_0^1 = 1.

P(0 ≤ x ≤ 1/3) = ∫_0^{1/3} 3x² dx = x³ |_0^{1/3} = 1/27
P(2/3 ≤ x ≤ 1) = ∫_{2/3}^1 3x² dx = x³ |_{2/3}^1 = 1 - 8/27 = 19/27
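A midpoint Riemann sum reproduces all three numbers (a numerical sketch for checking; the exact calculus above stands on its own):

```python
# Midpoint Riemann sum as a quick numerical integrator, applied to
# f(x) = 3x^2 on [0, 1].
def integrate(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 3 * x ** 2

total = integrate(f, 0, 1)       # should be 1: valid pdf
p_low = integrate(f, 0, 1 / 3)   # exact answer 1/27
p_high = integrate(f, 2 / 3, 1)  # exact answer 19/27
print(round(total, 6), round(p_low, 6), round(p_high, 6))
```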

Example 3. Example: Suppose we would like a continuous function to describe a variable Y in such a way that it produces y's near the middle of the range more frequently than near the extremes. What might this look like?

[Figure: an inverted-parabola density on [0,1], peaking at 1.5 near y = 0.5.] I've drawn one where f(y) = 6y(1 - y).

Example 3. Example: Does this meet the requirements of a probability density function?

Answer: Yes!
1. f_Y(y) ≥ 0 for all y;
2. P(Ω) = ∫_0^1 6y(1 - y) dy = 1.

Then, with this function, answer the following: What is the probability that a number is less than 0.5? Greater than 0.5? Less than 0.25? Greater than 0.25? Exactly 0.25?
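Since the antiderivative of 6y(1 - y) is F(y) = 3y² - 2y³, each of these questions takes one evaluation (a side computation; the slide leaves them as exercises):

```python
# cdf of f(y) = 6y(1 - y) on [0, 1]: F(y) = 3y^2 - 2y^3.
def F(y):
    return 3 * y ** 2 - 2 * y ** 3

total = F(1) - F(0)        # 1: valid pdf
p_below_half = F(0.5)      # 0.5, by symmetry about 1/2
p_below_quarter = F(0.25)  # 3/16 - 1/32 = 5/32 = 0.15625
p_exactly_quarter = 0.0    # a single point carries zero probability
print(total, p_below_half, p_below_quarter, p_exactly_quarter)
```

"Greater than 0.5" and "greater than 0.25" follow as 1 - F(0.5) and 1 - F(0.25).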

Cumulative Density Function. Definition: The cumulative density function (the cdf) of a continuous, real-valued random variable Y is the integral of its pdf up to the point of evaluation:

F_Y(y) = ∫_{-∞}^{y} f_Y(t) dt = P(Y ≤ y)

Note the difference in notation between a cumulative distribution function (cdf) and a probability density function (pdf).

Theorem: F(y) = ∫_{-∞}^{y} f(t) dt; conversely, d/dy F(y) = f(y).

Cumulative Density Examples. Example: A uniform distribution covers the range [0,10]. What is the probability that a random number drawn is less than 4?

[Figure: illustration of the cumulative density function, showing the area 0.4 under the pdf up to x = 4 matching the point F_X(4) = 0.4 on the cdf.]

∫_0^4 f(x) dx = 0.1x |_0^4 = 0.1(4) - 0.1(0) = 0.4; that is, F_X(x) = 0.1x and F_X(4) = 0.1(4) = 0.4.
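In code the uniform cdf is a one-liner, and interval probabilities come by subtraction (a sketch; the clamping outside [0,10] is my addition so the function is total):

```python
# cdf of Uniform(0, 10): F(x) = x/10 on [0, 10], clamped to [0, 1] outside.
def F(x, a=0.0, b=10.0):
    return min(max((x - a) / (b - a), 0.0), 1.0)

p_less_4 = F(4)         # 0.4
p_4_to_7 = F(7) - F(4)  # P(4 < X <= 7), about 0.3
print(p_less_4, p_4_to_7)
```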

CDF Properties. Theorem: Let Y be a continuous random variable with cdf F_Y(y). Then:
1. P(Y > s) = 1 - F_Y(s)
2. P(r < Y ≤ s) = F_Y(s) - F_Y(r)
3. lim_{y→+∞} F_Y(y) = 1
4. lim_{y→-∞} F_Y(y) = 0

Additionally, the cdf is a monotonically nondecreasing continuous function. These properties will be useful for easily and quickly calculating the probability that a variable takes a value on some interval.

An Example. In Encinitas the distribution of waves a broski, Tad, takes every hour, Y, is described by the pdf f_Y(y) = e^{-y}, y ≥ 0. What is the median of Encinitas' wave-income distribution?

In looking for the median, we are looking for the number where the probability below is equal to the probability above.

F_Y(y) = ∫_0^y e^{-t} dt = -e^{-t} |_0^y = 1 - e^{-y}
0.5 = 1 - e^{-y}
e^{-y} = 0.5
-y = ln(0.5)
y = ln 2 ≈ 0.69
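The same median drops out of a bisection on the cdf, which is handy when the logarithm can't be pulled out by hand (a numerical sketch, not part of the slides):

```python
# Median of f_Y(y) = e^(-y), y >= 0: solve F(y) = 1 - e^(-y) = 0.5
# by bisection; the closed form is ln 2.
import math

def F(y):
    return 1 - math.exp(-y)

lo, hi = 0.0, 10.0
for _ in range(60):          # halve the bracket until it is negligible
    mid = (lo + hi) / 2
    if F(mid) < 0.5:
        lo = mid
    else:
        hi = mid

median = (lo + hi) / 2
print(round(median, 6))  # 0.693147
```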

Expectation of Continuous Random Variables. Definition: If Y is a continuous random variable with pdf f_Y(y),

E[Y] = ∫_{-∞}^{∞} y f_Y(y) dy

In the same way as in the discrete case, one may think of this as the center of gravity of the distribution.

Example: A continuous uniform density function, f(x) = 1, is defined on the range [0,1]. What is the expected value of this random variable?

Answer:
E[X] = ∫_0^1 x f(x) dx = ∫_0^1 x dx = x²/2 |_0^1 = (1 - 0)/2 = 0.5

Variance of Continuous Random Variables. Definition: Let X be a real-valued random variable with density function f_X(x). Then the variance, σ², is defined by

σ² = E[(X - µ)²] = ∫_{-∞}^{∞} (x - µ)² f_X(x) dx

Expectation Identities.
Theorem: For any constant c, E[c] = c.
Theorem: For any random variable W and constant b, E[W + b] = E[W] + b.
Theorem: For any constant c and random variable X, E[cX] = cE[X].
Theorem: For any random variables X and Y with finite expectations, E[X + Y] = E[X] + E[Y].

Expected Value of a Function of a Random Variable. Theorem: Suppose X is a continuous random variable with pdf f_X(x). Let g(X) be any function of X. Then the expected value of the random variable g(X) is given by

E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx

Example: Suppose X has the uniform density on [0,1], so f_X(x) = 1. Let g(X) = (1/5)X. Then

E[g(X)] = ∫_0^1 (1/5)x f_X(x) dx = (1/5)[x²/2] |_0^1 = 1/10
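A midpoint sum over g(x)·f_X(x) confirms the 1/10 (side check, not in the original slides):

```python
# E[g(X)] for X ~ Uniform(0, 1) and g(x) = x/5, via a midpoint Riemann
# sum of g(x) * f_X(x) with f_X(x) = 1 on [0, 1].
def expect_g(g, n=100_000):
    h = 1.0 / n
    return sum(g((i + 0.5) * h) for i in range(n)) * h

e_g = expect_g(lambda x: x / 5)
print(round(e_g, 6))  # 0.1
```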

Example: The Zetas want to burn their gang-signs into a wall. Suppose the amount of fuel in the torch, Y, is a random variable with pdf f_Y(y) = 3y², 0 < y < 1. In the past they have been able to burn a circle whose area is 20π times the square of y. How much area can they expect to burn?

Answer: By the problem setup, g(Y) = 20πY².
E[g(Y)] = ∫_0^1 20πy² · 3y² dy = 60π ∫_0^1 y⁴ dy = 60π y⁵/5 |_0^1 = 12π ft²

Variance. Recall the definition of variance. Definition: The variance of a random variable is the expected value of its squared deviations from µ. If X is discrete with pdf p_X(k),

VAR[X] = σ² = E[(X - µ)²] = Σ_k (k - µ)² p_X(k).

Note that µ = E[X]. Also note that the standard deviation σ is the square root of the variance. Equivalently, VAR[X] = E[(X - E[X])²].

A Variance Identity. Theorem: Let X be any random variable (discrete or continuous) having mean µ and for which E[X²] is finite. Then

VAR[X] = E[(X - E[X])²] = E[X²] - E[X]²

Proof.
VAR[X] = E[(X - E[X])²]
= E[X² - 2X E[X] + E[X]²]
= E[X²] - E[2X E[X]] + E[E[X]²]
= E[X²] - E[2X]E[X] + E[X]²   (E[X] is a constant)
= E[X²] - 2E[X]E[X] + E[X]²
= E[X²] - E[X]²
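Both sides of the identity can be computed exactly for the die example from earlier (a side check):

```python
# VAR[X] two ways for a fair die: the definition E[(X - mu)^2] and the
# shortcut E[X^2] - E[X]^2 must agree.
from fractions import Fraction

p = Fraction(1, 6)
xs = range(1, 7)

mu = sum(x * p for x in xs)
var_def = sum((x - mu) ** 2 * p for x in xs)       # definition
var_short = sum(x ** 2 * p for x in xs) - mu ** 2  # shortcut
print(var_def, var_short)  # 35/12 35/12
```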

An Example. Example: Given the pdf f(x) = 3(1 - x)², 0 < x < 1, what is the variance?

Answer:
E[X] = ∫_0^1 x · 3(1 - x)² dx = 3 ∫_0^1 (x - 2x² + x³) dx = 1/4
E[X²] = ∫_0^1 x² · 3(1 - x)² dx = 3 ∫_0^1 (x² - 2x³ + x⁴) dx = 1/10
VAR[X] = E[X²] - E[X]² = 1/10 - 1/16 = 3/80
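Numerically, the same midpoint-sum integrator reproduces all three numbers (a check on the calculus above):

```python
# For f(x) = 3(1 - x)^2 on (0, 1): E[X] = 1/4, E[X^2] = 1/10,
# VAR[X] = 1/10 - 1/16 = 3/80 = 0.0375.
def integrate(g, a=0.0, b=1.0, n=200_000):
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 3 * (1 - x) ** 2
ex = integrate(lambda x: x * f(x))        # approx. 0.25
ex2 = integrate(lambda x: x ** 2 * f(x))  # approx. 0.1
var = ex2 - ex ** 2
print(round(var, 6))  # 0.0375
```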

Variance Identities. Theorem: For any random variable X and constant c,
VAR[cX] = c² VAR[X]
VAR[X + c] = VAR[X]

Proof. For ease, let E[X] = µ. Then E[cX] = cµ, and
VAR[cX] = E[(cX - cµ)²] = E[c²(X - µ)²] = c² E[(X - µ)²] = c² VAR[X]

Variance Identities. Theorem: Let X and Y be two independent random variables. Then VAR[X + Y] = VAR[X] + VAR[Y].

Proof. Let E[X] = a and E[Y] = b.
VAR[X + Y] = E[(X + Y)²] - (a + b)²
= E[X²] + 2E[XY] + E[Y²] - a² - 2ab - b²
= E[X²] - a² + E[Y²] - b²   (independence gives E[XY] = E[X]E[Y] = ab, so the cross terms cancel)
= VAR[X] + VAR[Y]
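A quick simulation makes the theorem concrete: two independent Uniform(0,1) draws each have variance 1/12, so their sum should show variance near 1/6 (an illustrative sketch; the seed and sample size are arbitrary choices of mine):

```python
# Monte Carlo check of VAR[X + Y] = VAR[X] + VAR[Y] for independent
# X, Y ~ Uniform(0, 1); each has variance 1/12, so the sum has 1/6.
import random
from statistics import pvariance

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

var_sum = pvariance([x + y for x, y in zip(xs, ys)])
print(round(var_sum, 3))  # close to 1/6 = 0.1667
```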

Introduction. What do we require of a probability distribution/density model?
- It must be non-negative for all outcomes in its sample space.
- It must sum or integrate to one across the sample space.

By this definition, f(y) = y/4 + 7y³/2, 0 ≤ y ≤ 1, is a valid pdf: f(y) ≥ 0 for 0 ≤ y ≤ 1, and ∫_0^1 (y/4 + 7y³/2) dy = 1. But how useful is this model?
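Checking the oddball pdf takes one antiderivative, F(y) = y²/8 + 7y⁴/8 (a side computation confirming it integrates to one):

```python
# f(y) = y/4 + 7y^3/2 on [0, 1] is non-negative there, and its
# antiderivative F(y) = y^2/8 + 7y^4/8 gives F(1) - F(0) = 1/8 + 7/8 = 1.
from fractions import Fraction

def F(y):
    y = Fraction(y)
    return y ** 2 / 8 + 7 * y ** 4 / 8

total = F(1) - F(0)
print(total)  # 1
```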

A Model's Merit. A pdf has utility as a probability model if it actually models the behavior of the real (or political) world. A surprisingly small number of distributions describe these real-world outcomes, because many measurements and outcomes result from the same set of assumptions about the data-generating process:
- Number of fraud incidents
- Number of latrines built
- Feeding patterns of zebra mussels