Outline: Random Variables
M. Sami Fadali, Professor of Electrical Engineering, University of Nevada, Reno

- Random variables
- CDF and pdf
- Joint random variables
- Correlated, independent, orthogonal random variables
- Correlation, convolution, correlation coefficient
- Normal distribution

Random Variable
$X : S \to \mathbb{R}$: a function mapping the elements of the sample space $S$ to the real line $\mathbb{R}$.
Equivalent event: the set of elementary events associated with a given real value.
Probability of the real value = sum of the probabilities of the associated elementary events in the sample space.

Examples
- Throw a die; the outcome is 1-6 dots. Random variable: maps $i$ dots to the number $i$, $i = 1, \dots, 6$.
- Measurement of any physical quantity with additive random error (noise).
- Pitch (card game): number of tricks collected. Probabilities of the equivalent events follow from the probabilities of the card outcomes.
Cumulative Distribution Function (CDF)
Definition: the CDF of a random variable $X$ is the function defined for each real number $x$ by
$F_X(x) = P(X \le x)$

Properties of the CDF
1. $F_X(x) \to 0$ as $x \to -\infty$
2. $F_X(x) \to 1$ as $x \to +\infty$
3. $F_X(x)$ is a nondecreasing function of $x$

Probability Density Function (pdf)
For a continuous random variable, the pdf is a nonnegative function $f_X$ defined on the real line such that for every real interval $[a, b]$
$P(a \le X \le b) = \int_a^b f_X(x)\,dx$
For a small increment $dx$: $f_X(x)\,dx \approx P(x \le X \le x + dx)$.

Properties of the pdf
1. $f_X(x) \ge 0$
2. $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
3. $F_X(x) = \int_{-\infty}^{x} f_X(u)\,du$
The first two properties follow from the axioms of probability; integrating the pdf gives the CDF.
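The pdf properties above can be checked numerically. A minimal sketch in Python (NumPy stands in for the slides' MATLAB; the uniform density on [0, 1] and the grid are illustrative choices):

```python
import numpy as np

# pdf of a uniform random variable on [0, 1] (illustrative choice)
def f(x):
    return np.where((x >= 0.0) & (x <= 1.0), 1.0, 0.0)

xs = np.linspace(-0.5, 1.5, 2001)
dx = xs[1] - xs[0]

area = float(np.sum(f(xs)) * dx)   # property 2: total area under the pdf is 1
F = np.cumsum(f(xs)) * dx          # property 3: CDF as the running integral of the pdf
```

The computed `F` is nondecreasing and rises from 0 to 1, matching the CDF properties.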
Uniform Distribution
$f_X(x) = \dfrac{1}{b-a}$ for $a \le x \le b$, and $0$ elsewhere.

Example: spin the pointer. The resting angle $\theta$ is uniform on $[\theta_1, \theta_2]$: the probability of a value in any subinterval of $[\theta_1, \theta_2]$ is proportional to the subinterval's length, and the area of the rectangular density must be unity.

MATLAB
>> rand(m, n)   % uniform over [0,1] (a=0, b=1), m by n matrix

[Figure: density and distribution of the uniform random variable.]

Expectation of a Random Variable
Expected value or mean of $X$, justified by relative frequency:
Discrete: $E[X] = \sum_i x_i P(X = x_i)$
Continuous: $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$
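The relative-frequency justification of the mean can be seen by simulation. A short sketch in Python (NumPy's `uniform` plays the role of a scaled `rand`; the endpoints are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

a, b = 2.0, 5.0                       # illustrative endpoints
x = rng.uniform(a, b, size=200_000)   # analogous to a + (b - a) * rand in MATLAB

sample_mean = x.mean()                # approaches E[X] = (a + b) / 2 = 3.5
```

As the number of samples grows, the sample mean converges to the expected value (law of large numbers).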
Function of a Random Variable
Expected value of the function $g(X)$:
Discrete: $E[g(X)] = \sum_i g(x_i) P(X = x_i)$
Continuous: $E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$

Properties of Expectation
Expectation is linear: $E[aX + b] = aE[X] + b$ and $E[X + Y] = E[X] + E[Y]$.

Jensen's Inequality
For a random variable $X$ and a convex function $g$:
$g(E[X]) \le E[g(X)]$

Moments
The $n$th moment is the expectation of the $n$th power of $X$.
First moment: the mean $E[X]$. Second moment: the mean square $E[X^2]$.
Discrete: $E[X^n] = \sum_i x_i^n P(X = x_i)$
Continuous: $E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$
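Jensen's inequality can be illustrated with the convex function $g(x) = x^2$, for which $E[g(X)] - g(E[X])$ is exactly the variance. A sketch in Python (the distribution and its parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(1.0, 2.0, size=100_000)   # illustrative distribution, mean 1, std 2

g_of_mean = x.mean() ** 2     # g(E[X]) with the convex choice g(x) = x^2
mean_of_g = (x ** 2).mean()   # E[g(X)]: the second moment, near 1^2 + 2^2 = 5
```

The gap `mean_of_g - g_of_mean` is the sample variance, which is nonnegative, so Jensen's inequality holds with $g(x) = x^2$.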
Variance
Second moment about the mean $\bar{x} = E[X]$:
$\sigma_X^2 = E[(X - \bar{x})^2]$
Standard deviation $\sigma_X$: the square root of the variance.
Discrete: $\sigma_X^2 = \sum_i (x_i - \bar{x})^2 P(X = x_i)$
Continuous: $\sigma_X^2 = \int_{-\infty}^{\infty} (x - \bar{x})^2 f_X(x)\,dx$

Properties of the Variance
$\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$
Uncorrelated $X$, $Y$: $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$

Variance Property
$\sigma_X^2 = E[X^2] - \bar{x}^2$
Proof:
$E[(X - \bar{x})^2] = E[X^2 - 2\bar{x}X + \bar{x}^2] = E[X^2] - 2\bar{x}E[X] + \bar{x}^2 = E[X^2] - \bar{x}^2$

Example: Uniform Distribution
Find the mean and variance of $X$ uniform on $[a, b]$.
Mean: $\bar{x} = (a + b)/2$
Mean square: $E[X^2] = (a^2 + ab + b^2)/3$
Variance: $\sigma_X^2 = E[X^2] - \bar{x}^2 = (b - a)^2/12$
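The uniform-distribution results can be verified by numerical integration of the defining expectations. A sketch in Python (the interval $[0, 1]$ and the grid resolution are illustrative):

```python
import numpy as np

a, b = 0.0, 1.0                        # illustrative interval
xs = np.linspace(a, b, 100_001)
dx = xs[1] - xs[0]
pdf = np.full_like(xs, 1.0 / (b - a))  # uniform density on [a, b]

mean = float(np.sum(xs * pdf) * dx)        # (a + b) / 2 = 0.5
mean_sq = float(np.sum(xs**2 * pdf) * dx)  # (a^2 + ab + b^2) / 3 = 1/3
var = mean_sq - mean**2                    # (b - a)^2 / 12 = 1/12
```

All three integrals agree with the closed-form expressions to within the discretization error.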
Moment Generating Function
$M_X(s) = E[e^{-sX}] = \int_{-\infty}^{\infty} f_X(x)\, e^{-sx}\,dx$: the Laplace transform of the pdf (also defined with $e^{sX}$).
Use a two-sided Laplace transform table.

Series Expansion and Moments
$M_X(s) = \sum_{n=0}^{\infty} \frac{(-s)^n}{n!} E[X^n]$
Moments: $E[X^n] = (-1)^n \left. \dfrac{d^n M_X(s)}{ds^n} \right|_{s=0}$

Characteristic Function
$\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\,dx$: the Fourier transform of the pdf. Use Fourier transform tables.
Moments: $E[X^n] = (-j)^n \left. \dfrac{d^n \Phi_X(\omega)}{d\omega^n} \right|_{\omega=0}$

Normal or Gaussian Density
$f_X(x) = \dfrac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\dfrac{(x - \bar{x})^2}{2\sigma^2}\right)$
- Symmetric about the mean.
- Peak value (at $x = \bar{x}$) is $\dfrac{1}{\sigma\sqrt{2\pi}}$ (larger for a sharper peak, i.e., smaller $\sigma$).
- Mode (most likely value) = mean.
- Standard normal distribution: zero mean, unit variance, $f(x) = e^{-x^2/2}/\sqrt{2\pi}$.
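The stated peak value and symmetry of the Gaussian density can be checked directly. A sketch in Python using only the standard library (the mean and standard deviation are illustrative):

```python
import math

def normal_pdf(x, m, sigma):
    # Gaussian density with mean m and standard deviation sigma
    return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

m, sigma = 2.0, 0.5    # illustrative parameters

peak = normal_pdf(m, m, sigma)   # equals 1 / (sigma * sqrt(2*pi)), the value at the mean
left = normal_pdf(m - 1.0, m, sigma)
right = normal_pdf(m + 1.0, m, sigma)   # symmetry: equals the value at m - 1
```

A smaller $\sigma$ raises the peak, consistent with the "larger for sharper peak" remark.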
Why is the normal distribution important?
- Fits many physical phenomena (central limit theorem).
- Completely described by its mean and variance.
- For Gaussian variables, independent and uncorrelated are equivalent.

Right Tail Probability
Probability of exceeding a given value (complementary cumulative distribution):
$Q(x) = P(X > x) = 1 - F_X(x)$
For the standard normal: $Q(x) = \int_x^{\infty} \dfrac{e^{-u^2/2}}{\sqrt{2\pi}}\,du$
In signal detection, $Q$ gives the probability of false alarm.
MATLAB
>> p = normspec([-inf,1],0,1,'outside')

$Q$ is monotonically decreasing, hence invertible. The inverse is important in some applications (signal detection: setting a threshold for a given probability of false alarm).

Error Function
erf: the error function
$\mathrm{erf}(x) = \dfrac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\,dt$
erfc: the complementary error function (invertible)
$\mathrm{erfc}(x) = 1 - \mathrm{erf}(x)$
[Figure: plots of erf(x) and erfc(x) versus x.]
Relation to Normal Distribution
$\mathrm{erf}(x)$ is the probability that a normal random variable with mean zero and variance $1/2$ lies in $[-x, x]$:
$\mathrm{erf}(x) = P(|Z| \le x)$, with $Z \sim N(0, 1/2)$
For negative arguments use the odd symmetry $\mathrm{erf}(-x) = -\mathrm{erf}(x)$.
[Figure: standard normal density with critical value; probability less than the upper bound is 0.9452.]

Erf and Gaussian Density
For the standard normal CDF and tail probability:
$F(x) = \dfrac{1}{2}\left[1 + \mathrm{erf}\!\left(\dfrac{x}{\sqrt{2}}\right)\right]$, $\quad Q(x) = \dfrac{1}{2}\,\mathrm{erfc}\!\left(\dfrac{x}{\sqrt{2}}\right)$

MATLAB: Computing Probabilities (similar in Maple)
>> erf(x)                         % error function
>> erfc(x)                        % complementary error function
>> 0.5*(1+erf(x/sqrt(2)))         % standard normal P(t < x)
>> 0.5*erfc(x/sqrt(2))            % standard normal P(t > x)
>> Qinv = sqrt(2)*erfinv(1-2*P)   % inverse Q(P)

Example: Test Scores
Test scores are normally distributed, $N(83, 64)$ (mean 83, standard deviation 8).
>> fun = @(x) exp(-(x-83).^2/128)./sqrt(128*pi);
>> integral(fun,83-16,83+16)      % within 2 sigma of the mean
ans = 0.9545
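The test-scores computation can be reproduced with the erf relation above instead of numerical integration. A sketch in Python's standard library (`math.erf` plays the role of MATLAB's `erf`):

```python
import math

def phi(x):
    # standard normal CDF via the error function: Phi(x) = (1 + erf(x/sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu, sigma = 83.0, 8.0                        # test-scores example: N(83, 64)
low, high = mu - 2 * sigma, mu + 2 * sigma   # within two standard deviations

p = phi((high - mu) / sigma) - phi((low - mu) / sigma)   # approx 0.9545
```

Standardizing with $(x - \mu)/\sigma$ reduces any normal probability to a standard normal one, so no integration is needed.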
Pseudorandom Number Generators
>> rand    % uniform distribution over [0,1]
>> randn   % standard normal
Shifting and scaling (see also random):
>> y = sigmay*randn + ybar   % normal with mean ybar, standard deviation sigmay

Impulsive pdf
Use impulses (Dirac deltas) in the pdf for discrete or mixed random variables: an impulse of area $p$ in the density at $y_0$ corresponds to a jump of height $p$ in the distribution at $y_0$.
[Figure: mixed density f(y) with an impulse at zero and the corresponding distribution F(y) with a jump of 0.5.]

Example: Half-Wave Rectifier
Half-wave rectifier driven by zero-mean Gaussian noise $X$: output $Y = \max(X, 0)$.
All negative inputs map to zero, so the density of $Y$ has an impulse of area $P(X \le 0) = 0.5$ at $y = 0$ and follows the Gaussian density for $y > 0$.
[Figure: rectifier circuit and the density/distribution of Y.]
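The impulse at zero in the rectifier output can be seen by simulation: roughly half the samples land exactly at zero. A sketch in Python (NumPy's `normal` stands in for `randn`; the sample size is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=100_000)   # zero-mean Gaussian noise input
y = np.maximum(x, 0.0)                   # half-wave rectifier: negative inputs clipped to 0

p_at_zero = float(np.mean(y == 0.0))     # mass of the impulse at y = 0, near 0.5
```

The discrete mass at $y = 0$ is exactly why the density of $Y$ needs an impulse: no ordinary pdf can carry nonzero probability at a single point.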
Multiple Random Variables
Multivariate distribution: distribution of a vector $\mathbf{x}$ of $n$ random variables ($n \times 1$ vector); bivariate = 2 variables.
Discrete case: the joint probability $P(X_1 = x_{1i}, X_2 = x_{2j})$ is a two-dimensional array; obtain the marginal probabilities by adding along each column or row.
Continuous case:
$F(x_1, x_2) = P(X_1 \le x_1, X_2 \le x_2) = \int_{-\infty}^{x_1}\int_{-\infty}^{x_2} f(u_1, u_2)\,du_2\,du_1$
Generalize to $n$ variables: $F(x_1, \dots, x_n) = P(X_1 \le x_1, \dots, X_n \le x_n)$.

Marginal Distributions
Marginal pdf: integrate out the other variables:
$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f(x_1, x_2)\,dx_2$

Conditional Distribution
From conditional probability, the conditional density of $X$ given $Y = y$:
$f_{X|Y}(x \mid y) = \dfrac{f_{XY}(x, y)}{f_Y(y)}$
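For the discrete case, the marginal and conditional rules amount to row/column sums of the joint probability array. A sketch in Python (the 2x2 joint table is an illustrative example, not from the slides):

```python
import numpy as np

# illustrative joint probability table P(X1 = i, X2 = j)
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

p_x1 = joint.sum(axis=1)   # marginal of X1: sum each row over x2 -> [0.3, 0.7]
p_x2 = joint.sum(axis=0)   # marginal of X2: sum each column over x1 -> [0.4, 0.6]

# conditional P(X1 | X2 = 0): joint column divided by the marginal P(X2 = 0)
p_x1_given_x2_0 = joint[:, 0] / p_x2[0]   # [0.25, 0.75]
```

Note that each conditional distribution sums to one, since dividing by the marginal renormalizes the column.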
Bayes' Rule for Random Variables
$f_{X|Y}(x \mid y) = \dfrac{f_{Y|X}(y \mid x)\, f_X(x)}{f_Y(y)}$

Independence
$X$ and $Y$ independent:
$f_{XY}(x, y) = f_X(x)\, f_Y(y)$, equivalently $f_{X|Y}(x \mid y) = f_X(x)$.

Sum of Independent Random Variables
For $Z = X + Y$ with $X$ and $Y$ independent, the pdf of $Z$ is the convolution of the two pdfs:
$f_Z(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\,dx$

Independent vs. Uncorrelated
Independent $\Rightarrow$ uncorrelated.
Uncorrelated $\Rightarrow$ independent? Not true in general; true for multivariate Gaussian variables.
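The convolution rule for the sum of independent variables can be checked on two uniform densities, whose sum has the well-known triangular density. A discretized sketch in Python (grid spacing is an illustrative choice):

```python
import numpy as np

dx = 0.001
x = np.arange(0.0, 1.0, dx)
f = np.ones_like(x)            # uniform pdf on [0, 1], sampled on the grid

# pdf of Z = X + Y for independent X, Y: discrete convolution scaled by dx
fz = np.convolve(f, f) * dx    # triangular density on [0, 2], peak near z = 1

peak = float(fz.max())         # approaches 1 at z = 1
area = float(fz.sum() * dx)    # total probability approaches 1
```

The triangular shape previews the central limit theorem: each further convolution smooths the density toward a Gaussian.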
Central Limit Theorem
Given $n$ independent random variables $X_1, \dots, X_n$, let $Z = X_1 + X_2 + \dots + X_n$.
Property of convolutions: the convolution of a large number of positive functions is approximately Gaussian.
Central limit theorem: $Z$ is asymptotically Gaussian as $n \to \infty$.

Correlation Coefficient
Normalized measure of the correlation between $X$ and $Y$:
$\rho = \dfrac{E[(X - \bar{x})(Y - \bar{y})]}{\sigma_X \sigma_Y}$
- Value between $-1$ and $1$.
- Zero for uncorrelated variables.
- $\mathrm{Var}(X + Y) = \sigma_X^2 + \sigma_Y^2 + 2\rho\,\sigma_X\sigma_Y$, which reduces to the variance property for $\rho = 0$.

Zero Correlation Coefficient
Uncorrelated: $E[XY] = E[X]\,E[Y]$, i.e., $\rho = 0$.

Unity Correlation Coefficient
$|\rho| = 1$ when $Y$ is a linear function of $X$: $Y = aX + b$.
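The two extreme cases of the correlation coefficient can be demonstrated by simulation: an exact linear relation gives $\rho = 1$, while independent variables give $\rho \approx 0$. A sketch in Python (sample size and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100_000)
noise = rng.normal(size=100_000)   # generated independently of x

rho_linear = np.corrcoef(x, 2.0 * x + 1.0)[0, 1]   # exact linear relation -> rho = 1
rho_indep = np.corrcoef(x, noise)[0, 1]            # independent samples -> rho near 0
```

The shift and scale in `2.0 * x + 1.0` do not change $\rho$, since the coefficient is normalized by the standard deviations and centered by the means.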
Range of the Correlation Coefficient
$-1 \le \rho \le 1$
Proof: for any real $a$,
$E\{[a(X - \bar{x}) + (Y - \bar{y})]^2\} = a^2\sigma_X^2 + 2a\rho\,\sigma_X\sigma_Y + \sigma_Y^2 \ge 0$
The quadratic in $a$ has no real roots (or a repeated root), so its discriminant is negative or zero:
$4\rho^2\sigma_X^2\sigma_Y^2 - 4\sigma_X^2\sigma_Y^2 \le 0 \;\Rightarrow\; \rho^2 \le 1$
(zero discriminant, i.e., equal roots, for $|\rho| = 1$).

Orthogonal Random Variables
$X$ and $Y$ are orthogonal if $E[XY] = 0$.

Correlation and Covariance
Correlation matrix: $R = E[\mathbf{x}\mathbf{x}^T]$. Covariance matrix: $C = E[(\mathbf{x} - \bar{\mathbf{x}})(\mathbf{x} - \bar{\mathbf{x}})^T]$.

Covariance Matrix
Generalization of the second moment and the variance to the vector case.
Can be written in terms of variances and correlation coefficients: $C_{ij} = \rho_{ij}\,\sigma_i\sigma_j$, with $\rho_{ii} = 1$.
Diagonal for uncorrelated (and hence for independent) variables.
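The expression of the covariance matrix in terms of standard deviations and a correlation coefficient can be made concrete in the bivariate case. A sketch in Python (the values of $\sigma_1$, $\sigma_2$, $\rho$ are illustrative):

```python
import numpy as np

# bivariate covariance matrix built from variances and the correlation coefficient
s1, s2, rho = 2.0, 3.0, 0.5    # illustrative standard deviations and correlation
C = np.array([[s1**2,          rho * s1 * s2],
              [rho * s1 * s2,  s2**2        ]])

eigs = np.linalg.eigvalsh(C)   # a valid covariance matrix has nonnegative eigenvalues
```

With $|\rho| < 1$ the matrix is positive definite; $|\rho| = 1$ makes it singular, matching the equal-roots case in the proof above.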
Multivariate Normal
$f(\mathbf{x}) = \dfrac{1}{(2\pi)^{n/2}\,|C|^{1/2}} \exp\!\left[-\dfrac{1}{2}(\mathbf{x} - \bar{\mathbf{x}})^T C^{-1} (\mathbf{x} - \bar{\mathbf{x}})\right]$
Generalization of the normal distribution to $n$ linearly independent random variables; the exponent is a quadratic form in $\mathbf{x} - \bar{\mathbf{x}}$.

Independent/Uncorrelated
If Gaussian random variables $X_1, \dots, X_n$ are mutually uncorrelated, they are also mutually independent.
Proof sketch (quadratic form): for uncorrelated variables $C$ is diagonal, the quadratic form reduces to a sum of squares, and the joint density factors into the product of the marginal densities.

Bivariate Gaussian
$f(x_1, x_2) = \dfrac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\!\left\{-\dfrac{1}{2(1-\rho^2)}\left[\dfrac{(x_1-\bar{x}_1)^2}{\sigma_1^2} - \dfrac{2\rho(x_1-\bar{x}_1)(x_2-\bar{x}_2)}{\sigma_1\sigma_2} + \dfrac{(x_2-\bar{x}_2)^2}{\sigma_2^2}\right]\right\}$
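A multivariate normal vector is completely specified by its mean vector and covariance matrix, which simulation confirms: sample statistics converge to the specified parameters. A sketch in Python (mean, covariance, and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
m = np.array([1.0, -1.0])          # illustrative mean vector
C = np.array([[1.0, 0.6],
              [0.6, 2.0]])         # illustrative covariance matrix

x = rng.multivariate_normal(m, C, size=200_000)   # samples, one row per draw

m_hat = x.mean(axis=0)             # sample mean approaches m
C_hat = np.cov(x.T)                # sample covariance approaches C
```

The off-diagonal entry of `C_hat` recovers the specified covariance $\rho\,\sigma_1\sigma_2 = 0.6$, illustrating that the joint Gaussian carries no structure beyond first and second moments.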
Properties of the Multivariate Normal Density
- $N(\bar{\mathbf{x}}, C)$: completely defined by the mean $\bar{\mathbf{x}}$ and covariance matrix $C$.
- If the joint pdf is normal: uncorrelated $\Rightarrow$ independent.
- All marginal and conditional pdfs are normal.
- A linear transformation of a normal vector gives a normal vector (next presentation).

Conclusion
- Probabilistic description of random variables.
- Moments, characteristic function, moment generating function.
- Correlation and covariance.
- Correlated, independent, orthogonal random variables.
- Normal (Gaussian) random variables.

References
- R. G. Brown and P. Y. C. Hwang, Introduction to Random Signals and Applied Kalman Filtering, Wiley, NY, 2012.
- H. Stark and J. W. Woods, Probability and Random Processes, Prentice Hall, Upper Saddle River, NJ, 2002.
- R. M. Gray and L. D. Davisson, Random Processes: A Mathematical Approach for Engineers, Prentice Hall, Englewood Cliffs, NJ, 1986.
- M. H. DeGroot and M. J. Schervish, Probability & Statistics, Addison-Wesley, Boston, 2002.
- S. M. Kay, Fundamentals of Statistical Signal Processing: Detection Theory, Prentice Hall, 1998.
- A. Papoulis and S. U. Pillai, Probability, Random Variables, and Stochastic Processes, 4th ed., McGraw-Hill, Boston, MA, 2002.