Introduction to Probability




Introduction to Probability
EE 179, Lecture 15, Handout #24

Probability theory gives a mathematical characterization of experiments with random outcomes:
- coin toss
- life of a lightbulb
- binary data sequence
- Brownian motion

An event is a set of outcomes belonging to a sample space. Events must be repeatable and have statistical regularity, i.e., over a large number of experiments the outcome patterns show regularity.

We define the probability of an event A as the limiting relative frequency with which the outcome belongs to A:

  P(A) = lim_{n→∞} (number of times outcome is in A) / n

Examples:
  P(roulette wheel outcome is red) = 18/38
  P(rain tomorrow) ≈ 57/365 (bogus)

EE 179, May 2, 2014, Lecture 15, Page 1
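As a quick illustration (not part of the original slides), the relative-frequency definition can be simulated; here we estimate P(red) for an American roulette wheel, where 18 of the 38 pockets are red:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def relative_frequency(n_trials=100_000):
    """Spin the wheel n_trials times; count how often the outcome is red.

    We model the 38 pockets as integers 0..37 and (arbitrarily) call the
    first 18 of them red -- only the count matters for the probability.
    """
    hits = sum(1 for _ in range(n_trials) if random.randrange(38) < 18)
    return hits / n_trials

estimate = relative_frequency()
exact = 18 / 38
print(f"estimate = {estimate:.4f}, exact = {exact:.4f}")
```

For large n the estimate settles near 18/38 ≈ 0.4737, exactly the limiting-frequency behavior the definition describes.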

Mathematics of Probability: Axiomatic Approach

Random events are defined on a probability space:
- a sample space S of possible outcomes (finite or infinite)
- a family (set) of events {A_i} that are subsets of S
- a probability measure P(·) on events

The probability measure has three properties:
- P(A) ≥ 0
- P(S) = 1
- if A_i ∩ A_j is empty for i ≠ j, then P(∪_i A_i) = Σ_i P(A_i)

We formally write the probability space as the triple (S, {A_i}, P(·)).

A very common notation for the sample space is Ω, and a generic outcome is ω.

This axiomatic approach was introduced by Kolmogorov around 1930. Probability has been used for thousands of years. Proverbs 16:33: "We may throw the dice, but the Lord determines how they fall."

Random Variables

Practical definition of a random variable: the numerical output of a probabilistic experiment.
- coin flip (tails = 0, heads = 1) or sum of two dice (2, 3, ..., 12)
- amount of snowfall at a location over a duration
- noise voltage at an instant, or the integral of noise over an interval

Mathematical definition: a real-valued function defined on the sample space of a probability space:

  X : Ω → R,  X(ω) ∈ R for every ω ∈ Ω

Examples:
- The sample space for a toss of two dice is {(i,j) : 1 ≤ i,j ≤ 6}. The sum i+j is a r.v.
- For the BSC, the input is the r.v. X = x and the output is Y = y. We have derived the joint probability distribution for X and Y.

If the values of a r.v. are discrete, the r.v. is called discrete. Otherwise the r.v. is continuous or mixed.

Probability Mass Function

The probability distribution of a discrete random variable is completely described by its probability mass function (pmf or PMF):

  P{X = x_k} = p_X(x_k), where the values of X are {x_k}

As a special case of the axioms of probability,

  p_X(x_k) ≥ 0,  Σ_k p_X(x_k) = 1

Important discrete random variables:
- Bernoulli: Ω = {0,1}, p(1) = p, p(0) = 1 − p.
- Binomial: S_n = Σ_{k=1}^n X_k, where the X_k are independent Bernoulli.
- Geometric: p(n) = (1 − p)^{n−1} p for n = 1, 2, ... and 0 ≤ p ≤ 1.
- Poisson: p(n) = (λ^n / n!) e^{−λ}, where n ≥ 0 and λ ≥ 0.

Solving problems about discrete r.v.s usually requires manipulating sums (combinatorics).

Cumulative Distribution Function

For a continuous random variable, P{X = x} = 0 for all x, so we cannot use a pmf. The cumulative distribution function (cdf or CDF) can describe both discrete and continuous r.v.s. The CDF of a real-valued r.v. X is defined by

  F_X(x) = P{X ≤ x},  −∞ < x < ∞

Properties of the CDF:
- Monotone: if x_1 ≤ x_2 then F(x_1) ≤ F(x_2)
- Limits: F(−∞) = lim_{x→−∞} F(x) = 0,  F(+∞) = lim_{x→+∞} F(x) = 1
- Interval: P{a < X ≤ b} = P{X ≤ b} − P{X ≤ a} = F(b) − F(a)
- Point: P{X = x} = P{X ≤ x} − P{X < x} = F(x) − F(x−), where F(x−) = lim_{u↑x} F(u)

A random variable is continuous if its cdf F(x) is continuous at every x.

Another definition of the CDF is P{X < x}; this convention is used by Russian mathematicians.
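A minimal sketch of the interval property P{a < X ≤ b} = F(b) − F(a), using the exponential CDF F(x) = 1 − e^{−λx} (which appears later in the lecture) as a concrete example:

```python
import math

def F(x, lam=2.0):
    """Exponential CDF with rate lam (lam=2.0 is an arbitrary example)."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

a, b = 0.5, 1.5
interval_prob = F(b) - F(a)   # P{a < X <= b}
print(interval_prob)

# Monotonicity: a <= b implies F(a) <= F(b), so the interval probability is >= 0.
assert F(a) <= F(b)
```

For λ = 2, F(b) − F(a) = e^{−1} − e^{−3} ≈ 0.318, matching the closed form (1 − e^{−2b}) − (1 − e^{−2a}).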

Types of CDFs

- The CDF of any discrete r.v. is an increasing staircase function.
- The CDF of a continuous r.v. is a continuous nondecreasing function.
- The CDF of a mixed r.v. is continuous between jumps; P{X = x} > 0 for some x.

Here "nondecreasing" means increasing but not necessarily strictly increasing.

Probability Density Function

If X is a continuous r.v., then

  P{x_1 ≤ X ≤ x_2} = F_X(x_2) − F_X(x_1)

If F_X(x) is differentiable, then

  F_X(x_2) − F_X(x_1) = ∫_{x_1}^{x_2} p_X(u) du,  where p_X(x) = dF_X(x)/dx

We call p_X(x) the probability density function (pdf or PDF) of X; p_X(x) is the probability per unit width of a narrow interval around x.

Properties of the PDF

- Nonnegative: p(x) ≥ 0, since F(x) is nondecreasing.
- The CDF is the antiderivative of the PDF:

    F(x) = ∫_{−∞}^{x} p(u) du

- Impulses: if P{X = x_0} = p_0 > 0, then p_X(x) contains the term p_0 δ(x − x_0).
- Mixed r.v.: if F(x) is differentiable except at the discrete points {x_k}, then

    p(x) = p̃(x) + Σ_k p_k δ(x − x_k)

  where p̃(x) is a nonnegative continuous function and ∫ p̃(x) dx = 1 − Σ_k p_k.

Most books use f_X(x) for a pdf and p_X(x) for a pmf.

Statistics of Random Variables

The complete description of a random variable is its CDF, which specifies the probabilities of all intervals, e.g., {X > x_0}. To compare two r.v.s we often need single numbers (statistics) associated with each random variable. The most common statistics are:

- Mean (average, expected value):

    X̄ = E(X) = ∫ x p(x) dx   or   Σ_n x_n p(x_n)

- Second moment:

    E(X²) = ∫ x² p(x) dx   or   Σ_n x_n² p(x_n)

- Variance:

    Var(X) = E((X − X̄)²) = E(X²) − (E(X))²

- Median: the value X_med satisfying P{X < X_med} = P{X > X_med}
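A small worked example (a fair six-sided die, chosen for illustration) computes the discrete-case statistics above directly from the pmf:

```python
# Fair die: values 1..6, each with pmf value 1/6.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

mean = sum(x * pmf[x] for x in values)                # E(X) = sum x p(x)
second_moment = sum(x * x * pmf[x] for x in values)   # E(X^2) = sum x^2 p(x)
variance = second_moment - mean ** 2                  # Var = E(X^2) - (E X)^2

print(mean, second_moment, variance)
```

The results are E(X) = 3.5, E(X²) = 91/6 ≈ 15.167, and Var(X) = 35/12 ≈ 2.917, with the variance obtained from the shortcut formula on the slide.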

Examples of Continuous Random Variables

A uniform random variable has a constant density on an interval. We write X ~ Unif[a,b] if p_X(x) is constant on [a,b] and 0 elsewhere:

  p_X(x; a,b) = 1/(b−a) for a ≤ x ≤ b,  0 for x < a or x > b

Examples of uniform random variables: the final position of a roulette wheel, or quantization error.

  E(X) = ∫_a^b x/(b−a) dx = x²/(2(b−a)) |_a^b = (b² − a²)/(2(b−a)) = (b+a)/2

  E(X²) = ∫_a^b x²/(b−a) dx = x³/(3(b−a)) |_a^b = (b³ − a³)/(3(b−a)) = (b² + ba + a²)/3

  Var(X) = (b² + ba + a²)/3 − (b² + 2ba + a²)/4 = (b² − 2ba + a²)/12 = (b−a)²/12
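The closed forms (b+a)/2 and (b−a)²/12 can be sanity-checked numerically; this sketch integrates the Unif[2,5] density with a simple midpoint rule (the interval [2,5] is an arbitrary example):

```python
# Midpoint-rule check of the Unif[a,b] mean and variance formulas.
a, b = 2.0, 5.0
N = 100_000
dx = (b - a) / N
midpoints = [a + (k + 0.5) * dx for k in range(N)]
density = 1 / (b - a)   # constant pdf on [a, b]

mean = sum(x * density * dx for x in midpoints)            # E(X)
var = sum((x - mean) ** 2 * density * dx for x in midpoints)  # Var(X)
print(mean, var)
```

Both numbers land on the closed forms: (5+2)/2 = 3.5 and (5−2)²/12 = 0.75.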

Examples of Continuous Random Variables (cont.)

An exponential random variable has one parameter, λ:

  f(x; λ) = λ e^{−λx} for x ≥ 0,  0 for x < 0

The CDF for x ≥ 0 is

  F(x; λ) = ∫_0^x λ e^{−λu} du = −e^{−λu} |_0^x = 1 − e^{−λx}

The mean (integrating by parts) is

  E(X) = ∫_0^∞ x λ e^{−λx} dx = −x e^{−λx} |_0^∞ + ∫_0^∞ e^{−λx} dx = 1/λ

The variance (integrating by parts twice) is

  Var(X) = ∫_0^∞ (x − 1/λ)² λ e^{−λx} dx = 1/λ²
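A simulation sketch of these results: inverting the CDF gives the standard inverse-transform sampler X = −ln(U)/λ with U uniform on (0,1], and the sample mean and variance should approach 1/λ and 1/λ² (λ = 2 is an arbitrary example):

```python
import math
import random

random.seed(1)   # reproducible run
lam = 2.0
n = 200_000

# Inverse-transform sampling: solve u = 1 - exp(-lam*x) for x.
# Using 1 - random.random() keeps the argument of log in (0, 1].
samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
print(mean, var)   # close to 1/lam = 0.5 and 1/lam^2 = 0.25
```

With 200,000 samples the estimates sit within about a percent of 0.5 and 0.25.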

Examples of Continuous Random Variables (cont.)

A Gaussian random variable has two parameters, µ and σ. Its pdf is

  N(x; µ, σ²) = (1/√(2πσ²)) exp(−(x − µ)²/(2σ²))

The Gaussian PDF is centered at x = µ, where it attains its maximum value. The mean is µ (obvious from symmetry) and the variance is σ². The inflection points of the density graph are at x = µ ± σ. The density decreases faster than exponentially as x → ±∞.
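A quick numerical check (with example parameters µ = 1, σ = 2) that the density above integrates to 1 and peaks at x = µ:

```python
import math

def gaussian_pdf(x, mu=1.0, sigma=2.0):
    """N(x; mu, sigma^2) density; mu=1, sigma=2 are example parameters."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

mu, sigma = 1.0, 2.0
# Integrate over [mu - 10*sigma, mu + 10*sigma]; the tails beyond are negligible.
lo, hi, N = mu - 10 * sigma, mu + 10 * sigma, 200_000
dx = (hi - lo) / N
total = sum(gaussian_pdf(lo + (k + 0.5) * dx) * dx for k in range(N))
print(total)   # ~1.0
```

The midpoint-rule sum comes out at 1 to high accuracy, and gaussian_pdf(mu) exceeds the density anywhere else, consistent with the maximum at x = µ.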

Joint Random Variables

In communication systems we usually have two random signals defined on the same sample space:
- the transmitted signal x(t)
- the received signal y(t)

For times t_1 and t_2, the values x(t_1) and y(t_2) are joint random variables.

Joint r.v.s are characterized by a joint CDF:

  F_XY(x,y) = P{X ≤ x, Y ≤ y} = P{(X,Y) lies in the lower-left quadrant with corner (x,y)}

If X and Y are jointly continuous, their joint PDF is given by

  p_XY(x,y) = ∂²F_XY(x,y) / (∂x ∂y)

Properties of the Joint PDF

- Nonnegative: p_XY(x,y) ≥ 0; equivalently, P{(X,Y) ∈ (a,b] × (c,d]} ≥ 0, that is,

    F(b,d) − F(a,d) − F(b,c) + F(a,c) ≥ 0

- Normalization:

    ∫∫ p(x,y) dx dy = 1
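A small sketch of the rectangle formula F(b,d) − F(a,d) − F(b,c) + F(a,c), using independent Unif[0,1] variables as an example, for which F(x,y) = xy on the unit square:

```python
def F(x, y):
    """Joint CDF of two independent Unif[0,1] r.v.s (example choice)."""
    return max(0.0, min(x, 1.0)) * max(0.0, min(y, 1.0))

a, b, c, d = 0.2, 0.6, 0.1, 0.5
prob = F(b, d) - F(a, d) - F(b, c) + F(a, c)   # P{a < X <= b, c < Y <= d}
print(prob)
```

For this choice the rectangle probability is simply (b − a)(d − c) = 0.4 × 0.4 = 0.16, and the four-term combination reproduces it exactly, staying nonnegative as the property requires.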