Collection of Formulas
ST080 Probability Theory
Thommy Perlinger
August 7
Contents

1  Multivariate Random Variables
2  Conditioning
   2.1  Conditional distributions
   2.2  Conditional expectations
3  Transforms
   3.1  The moment generating function
   3.2  The probability generating function
   3.3  Sums of a random number of random variables
   3.4  Branching processes
4  Order Statistics
5  The Multivariate Normal Distribution
6  Convergence
7  Inequalities
8  Discrete Distributions
9  Continuous Distributions
1  Multivariate Random Variables

The transformation theorem. Let X be a continuous random vector with density function f_X(x) with its mass concentrated on S ⊆ R^n. Further, let g = (g_1, ..., g_n) be a bijection from S to T ⊆ R^n and set Y = g(X). The density function of Y is given by

    f_Y(y) = f_X(h_1(y), h_2(y), ..., h_n(y)) · |J|,    y ∈ T,

where h = (h_1, ..., h_n) is the unique inverse of g and where J is the Jacobian determinant

    J = d(x_1, ..., x_n)/d(y_1, ..., y_n) = det(∂x_i/∂y_j).

2  Conditioning

2.1  Conditional distributions

Let X and Y be jointly distributed random variables. The conditional distribution of Y given X = x is in the discrete case, for p_X(x) > 0, given by

    p_{Y|X=x}(y) = p_{X,Y}(x, y) / p_X(x),

and in the continuous case, for f_X(x) > 0, given by

    f_{Y|X=x}(y) = f_{X,Y}(x, y) / f_X(x).

2.2  Conditional expectations

Let X and Y be jointly distributed random variables and h a real-valued function. The conditional expectation of h(Y) given X = x is

    E(h(Y) | X = x) = ∫ h(y) f_{Y|X=x}(y) dy      in the continuous case,
    E(h(Y) | X = x) = Σ_{all y} h(y) p_{Y|X=x}(y)  in the discrete case.

Suppose that E|Y| < ∞. Then

    E[E(Y | X)] = E(Y).

Furthermore, if E(Y²) < ∞, then

    Var(Y) = Var(E(Y | X)) + E[Var(Y | X)].

We further have that

    E(g(X) · Y | X) = g(X) · E(Y | X),
    E(Y | X) = E(Y)    if X and Y are independent.
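As a concrete illustration, the tower property and the variance decomposition above can be verified exactly on a small discrete example. The joint distribution below is invented purely for illustration; exact rational arithmetic avoids rounding issues.

```python
from fractions import Fraction as F

# Hypothetical joint pmf of (X, Y) on {0,1}^2, invented purely for illustration
joint = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(1, 4), (1, 1): F(1, 4)}

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Marginals and the conditional pmfs p_{Y|X=x}
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}
cond = {x: {y: joint[x, y] / p_X[x] for y in (0, 1)} for x in (0, 1)}

# Tower property: E(Y) = E[E(Y|X)]
assert mean(p_Y) == sum(p_X[x] * mean(cond[x]) for x in p_X)

# Variance decomposition: Var(Y) = Var(E(Y|X)) + E[Var(Y|X)]
cond_mean = {x: mean(cond[x]) for x in p_X}        # values taken by E(Y|X)
m = sum(p_X[x] * cond_mean[x] for x in p_X)
var_cond_mean = sum(p_X[x] * (cond_mean[x] - m) ** 2 for x in p_X)
mean_cond_var = sum(p_X[x] * var(cond[x]) for x in p_X)
assert var(p_Y) == var_cond_mean + mean_cond_var
print(var(p_Y))  # 15/64
```

Both identities hold exactly here; the decomposition splits Var(Y) = 15/64 into 1/64 (explained by X) plus 14/64 (residual).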
3  Transforms

3.1  The moment generating function

The moment generating function of a random variable X is

    ψ_X(t) = E e^{tX},

provided there exists h > 0 such that the expectation exists and is finite for |t| < h. The moment generating function can be used to generate moments by using the property that

    ψ_X^{(n)}(0) = E(X^n).

Let X be a random variable whose moment generating function exists, and a and b real constants. Then

    ψ_{aX+b}(t) = e^{bt} ψ_X(at).

Let X_1, X_2, ..., X_n be independent random variables whose moment generating functions exist and a_1, ..., a_n real constants. Set

    S_n = a_1 X_1 + a_2 X_2 + ... + a_n X_n = Σ_{k=1}^n a_k X_k.

Then

    ψ_{S_n}(t) = Π_{k=1}^n ψ_{X_k}(a_k t).

Let X = (X_1, ..., X_n)' be a random vector. The moment generating function of X is

    ψ_X(t) = E e^{t'X} = E e^{t_1 X_1 + t_2 X_2 + ... + t_n X_n},

provided there exists h > 0 such that the expectation exists and is finite for |t_k| < h, k = 1, ..., n.

3.2  The probability generating function

Let X be a nonnegative, integer-valued random variable. The probability generating function of X is

    g_X(t) = E t^X = Σ_{k=0}^∞ t^k P(X = k).

The probability generating function can be used to generate probabilities by using the property that

    P(X = k) = g_X^{(k)}(0) / k!.
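As a numerical sanity check of the moment property ψ^{(n)}(0) = E(X^n), the derivatives at 0 can be approximated by finite differences. The choice X ~ Bin(10, 0.3) is arbitrary; its MGF (q + pe^t)^n appears in the table of Section 8.

```python
import math

# MGF of X ~ Bin(n, p): psi(t) = (q + p e^t)^n with q = 1 - p
def psi(t, n=10, p=0.3):
    return (1 - p + p * math.exp(t)) ** n

h = 1e-5  # step for the finite-difference approximations

# psi'(0) ~ central difference, should approximate E(X) = n p = 3
d1 = (psi(h) - psi(-h)) / (2 * h)

# psi''(0) ~ second difference, should approximate E(X^2) = npq + (np)^2 = 11.1
d2 = (psi(h) - 2 * psi(0) + psi(-h)) / h ** 2

print(round(d1, 3), round(d2, 2))  # 3.0 11.1
```

The same finite-difference idea applied to g_X at t = 0 recovers probabilities, per the property above.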
Furthermore, if E(X^k) < ∞ for some k = 1, 2, ..., the probability generating function can be used to generate factorial moments by using the property that

    g_X^{(k)}(1) = E[X(X − 1) ··· (X − k + 1)].

Let X_1, X_2, ..., X_n be independent, nonnegative, integer-valued random variables. If

    S_n = X_1 + X_2 + ... + X_n = Σ_{k=1}^n X_k,

then

    g_{S_n}(t) = Π_{k=1}^n g_{X_k}(t).

3.3  Sums of a random number of random variables

Let X_1, X_2, ... be i.i.d. random variables whose moment generating function exists. Furthermore, let N be a nonnegative, integer-valued random variable independent of X_1, X_2, .... Set S_0 = 0 and S_N = X_1 + X_2 + ... + X_N for N ≥ 1. Then

    ψ_{S_N}(t) = g_N(ψ_X(t)).

If, moreover, E(N) < ∞ and E|X| < ∞, then

    E(S_N) = E(N) · E(X).

If, in addition, E(N²) < ∞ and E(X²) < ∞, then

    Var(S_N) = E(N) · Var(X) + (E(X))² · Var(N).

3.4  Branching processes

A branching process is called a Galton-Watson process if
  • all individuals give birth according to the same probability distribution, independently of each other, and
  • the number of children produced by an individual is independent of the number of individuals in their generation.

Notation. Let Y represent the number of children produced by an individual. Let the common probability function of Y be given by p_k = P(Y = k), k = 0, 1, 2, ..., and the common probability generating function be given by g(t).
Let X(n) represent the number of individuals in generation n, where X(0) = 1. Furthermore, let the probability generating function of X(n) be given by g_n(t). For such a branching process we have

    g_n(t) = g_{n−1}(g(t)) = g(g_{n−1}(t)).

If m = E(Y), then

    E(X(n)) = m^n.

Furthermore, if σ² = Var(Y), then

    Var(X(n)) = σ² m^{n−1} (m^n − 1)/(m − 1)  for m ≠ 1,
    Var(X(n)) = n σ²                          for m = 1.

Denote by η the probability of ultimate extinction of the branching process. Then
  • η satisfies the equation t = g(t);
  • η is the smallest nonnegative root of the equation t = g(t);
  • η = 1 for m ≤ 1 and η < 1 for m > 1.

4  Order Statistics

Let X_1, X_2, ..., X_n be i.i.d. random variables with distribution function F and density function f (or probability function p). For k = 1, ..., n, let

    X_(k) = the k:th smallest of X_1, X_2, ..., X_n.

(X_(1), X_(2), ..., X_(n)) is called the order statistic and X_(k) the k:th order variable, k = 1, ..., n. The marginal density of the k:th order variable, k = 1, ..., n, is given by

    f_{X_(k)}(x) = n!/((k − 1)!(n − k)!) · (F(x))^{k−1} (1 − F(x))^{n−k} f(x).

The joint density of the extremes (X_(1), X_(n)) is given by

    f_{X_(1),X_(n)}(x, y) = n(n − 1)(F(y) − F(x))^{n−2} f(x) f(y),    x < y.

The density of the range, i.e. R_n = X_(n) − X_(1), is given by

    f_{R_n}(r) = n(n − 1) ∫_{−∞}^{∞} (F(u + r) − F(u))^{n−2} f(u) f(u + r) du,    r > 0.

The joint density of the order statistic is given by

    f_{X_(1),...,X_(n)}(x_1, ..., x_n) = n! Π_{k=1}^n f(x_k),    x_1 < x_2 < ... < x_n.
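The marginal density above can be checked numerically in the uniform case. Plugging F(x) = x, f(x) = 1 into the formula gives a Beta(k, n − k + 1) density, whose mean k/(n + 1) serves as the target here; n = 5 and k = 2 are arbitrary example values.

```python
import math

# Marginal density of the k:th order variable of n i.i.d. U(0,1) variables,
# i.e. the general formula with F(x) = x and f(x) = 1 substituted in
def f_order(x, n, k):
    c = math.factorial(n) // (math.factorial(k - 1) * math.factorial(n - k))
    return c * x ** (k - 1) * (1 - x) ** (n - k)

# Trapezoidal check: the density integrates to 1 and E(X_(k)) = k/(n+1)
n, k, N = 5, 2, 100_000
h = 1.0 / N
xs = [i * h for i in range(N + 1)]
ys = [f_order(x, n, k) for x in xs]
total = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
mean = h * sum(x * y for x, y in zip(xs, ys))
print(round(total, 6), round(mean, 6))  # 1.0 0.333333
```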
5  The Multivariate Normal Distribution

For X ∼ N(μ, Λ) the moment generating function is given by

    ψ_X(t) = exp( t'μ + (1/2) t'Λt ).

Furthermore, if det Λ > 0, the density function is given by

    f_X(x) = 1/((2π)^{n/2} (det Λ)^{1/2}) · exp( −(1/2)(x − μ)' Λ^{−1} (x − μ) ),    x ∈ R^n.

6  Convergence

Let X, X_1, X_2, ... be random variables.

Definition 1. X_n converges almost surely (a.s.) to the random variable X as n → ∞ iff

    P({ω : X_n(ω) → X(ω) as n → ∞}) = 1.

Notation: X_n → X a.s. as n → ∞.

Definition 2. X_n converges in probability to the random variable X as n → ∞ iff, for every ε > 0,

    lim_{n→∞} P(|X_n − X| > ε) = 0,  or equivalently  lim_{n→∞} P(|X_n − X| ≤ ε) = 1.

Notation: X_n →(p) X as n → ∞.

Definition 3. X_n converges in r-mean to the random variable X as n → ∞ iff

    E|X_n − X|^r → 0 as n → ∞.

Notation: X_n →(r) X as n → ∞.

Definition 4. X_n converges in distribution to the random variable X as n → ∞ iff

    F_{X_n}(x) → F_X(x) as n → ∞ for all x ∈ C(F_X),

where C(F_X) is the continuity set of F_X. Notation: X_n →(d) X as n → ∞.
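A small check of the density formula in the bivariate case (n = 2; all numbers below are arbitrary): when Λ is diagonal the components are independent, so the joint density must equal the product of the two univariate N(μ_i, σ_i²) densities.

```python
import math

def mvn2_pdf(x, mu, L):
    """Bivariate N(mu, Lambda) density, using the explicit 2x2 inverse."""
    (a, b), (c, d) = L
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # Lambda^{-1}
    z = [x[0] - mu[0], x[1] - mu[1]]
    q = sum(z[i] * inv[i][j] * z[j] for i in range(2) for j in range(2))
    return math.exp(-q / 2) / (2 * math.pi * math.sqrt(det))

def norm_pdf(x, mu, s2):
    return math.exp(-(x - mu) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

# Diagonal Lambda => joint density factors into the marginal densities
x, mu, L = (1.0, -0.5), (0.2, 0.1), [[2.0, 0.0], [0.0, 0.5]]
joint = mvn2_pdf(x, mu, L)
product = norm_pdf(x[0], mu[0], 2.0) * norm_pdf(x[1], mu[1], 0.5)
print(abs(joint - product) < 1e-12)  # True
```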
7  Inequalities

Chebyshev's inequality. Let X be a random variable with E(X) = μ and Var(X) = σ², both finite. Then for any ε > 0,

    P(|X − μ| > ε) ≤ σ²/ε².

Markov's inequality. Let X be a positive random variable with E(X) < ∞. Then for any ε > 0,

    P(X > ε) ≤ E(X)/ε.

Hölder's inequality. Let X and Y be any two random variables, and let p > 1 and q > 1 satisfy 1/p + 1/q = 1. Then

    E|XY| ≤ (E|X|^p)^{1/p} (E|Y|^q)^{1/q}.

Cauchy-Schwarz inequality. For any two random variables X and Y,

    (E|XY|)² ≤ E(X²) · E(Y²).

Minkowski's inequality. Let X and Y be any two random variables. Then for p ≥ 1,

    (E|X + Y|^p)^{1/p} ≤ (E|X|^p)^{1/p} + (E|Y|^p)^{1/p}.

Jensen's inequality. For any random variable X, if g(x) is a convex function, then

    E(g(X)) ≥ g(E(X)).

Equality holds if and only if, for every line a + bx that is tangent to g(x) at x = E(X), P(g(X) = a + bX) = 1.
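Chebyshev's bound can be compared with an exact tail probability by summing a pmf directly; X ~ Bin(10, 0.3) and ε = 3 are arbitrary example choices.

```python
import math

# Exact pmf of X ~ Bin(n, p)
n, p = 10, 0.3
pmf = [math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]
mu = n * p                    # E(X) = 3.0
sigma2 = n * p * (1 - p)      # Var(X) = 2.1

eps = 3.0
exact = sum(pr for k, pr in enumerate(pmf) if abs(k - mu) > eps)
bound = sigma2 / eps ** 2     # Chebyshev: P(|X - mu| > eps) <= sigma^2 / eps^2
print(exact <= bound)  # True
```

The bound is typically loose: here the exact tail is about 0.01 while the bound is about 0.23, which is in the nature of a distribution-free inequality.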
8  Discrete Distributions

Following is a list of discrete distributions: abbreviations, probability functions p(k), means E(X), variances Var(X), moment generating functions ψ(t), and probability generating functions g(t). An asterisk (*) indicates that the expression is too complicated to present here; in some cases a closed formula does not even exist. Whenever the parameter p is present it is understood that q = 1 − p.

One point, δ(a):
    p(a) = 1;  E(X) = a;  Var(X) = 0;  ψ(t) = e^{at};  g(t) = t^a.

Discrete uniform on {1, ..., n}:
    p(k) = 1/n, k = 1, ..., n;  E(X) = (n + 1)/2;  Var(X) = (n² − 1)/12;
    ψ(t) = (1/n) Σ_{k=1}^n e^{kt};  g(t) = (1/n) Σ_{k=1}^n t^k.

Symmetric Bernoulli:
    p(−1) = p(1) = 1/2;  E(X) = 0;  Var(X) = 1;  ψ(t) = cosh t;  g(t) = (1/2)(t + 1/t).

Bernoulli, Be(p):
    p(0) = q, p(1) = p;  E(X) = p;  Var(X) = pq;  ψ(t) = q + pe^t;  g(t) = q + pt.

Binomial, Bin(n, p):
    p(k) = C(n, k) p^k q^{n−k}, k = 0, 1, ..., n;  E(X) = np;  Var(X) = npq;
    ψ(t) = (q + pe^t)^n;  g(t) = (q + pt)^n.

Geometric, Ge(p):
    p(k) = p q^k, k = 0, 1, 2, ...;  E(X) = q/p;  Var(X) = q/p²;
    ψ(t) = p/(1 − qe^t) for t < −ln q;  g(t) = p/(1 − qt).

First success, Fs(p):
    p(k) = p q^{k−1}, k = 1, 2, ...;  E(X) = 1/p;  Var(X) = q/p²;
    ψ(t) = pe^t/(1 − qe^t) for t < −ln q;  g(t) = pt/(1 − qt).

Negative binomial, NBin(n, p):
    p(k) = C(n + k − 1, k) p^n q^k, k = 0, 1, 2, ...;  E(X) = nq/p;  Var(X) = nq/p²;
    ψ(t) = (p/(1 − qe^t))^n for t < −ln q;  g(t) = (p/(1 − qt))^n.

Poisson, Po(m):
    p(k) = e^{−m} m^k/k!, k = 0, 1, 2, ...;  E(X) = m;  Var(X) = m;
    ψ(t) = e^{m(e^t − 1)};  g(t) = e^{m(t − 1)}.

Hypergeometric, H(N, n, p):
    p(k) = C(Np, k) C(Nq, n − k) / C(N, n);  E(X) = np;  Var(X) = npq (N − n)/(N − 1);
    ψ(t) = *;  g(t) = *.
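Entries of the table can be sanity-checked numerically. For example, for Ge(p) with p = 0.4 (an arbitrary value), a truncated sum over p(k) = p q^k reproduces E(X) = q/p and Var(X) = q/p²:

```python
# Moments of Ge(p) from its pmf p(k) = p * q^k, compared with the table
p = 0.4
q = 1 - p
K = 200  # truncation point; q^K is negligible at this p

pmf = [p * q ** k for k in range(K)]
mean = sum(k * pr for k, pr in enumerate(pmf))
second = sum(k * k * pr for k, pr in enumerate(pmf))
var = second - mean ** 2

print(round(mean, 6), round(var, 6))  # q/p = 1.5, q/p^2 = 3.75
```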
9  Continuous Distributions

Following is a list of continuous distributions: abbreviations, density functions f(x), means E(X), variances Var(X), and, when they exist, moment generating functions ψ(t). An asterisk (*) indicates that the expression is too complicated to present here; in some cases a closed formula does not even exist.

Uniform/Rectangular, U(a, b):
    f(x) = 1/(b − a), a < x < b;  E(X) = (a + b)/2;  Var(X) = (b − a)²/12;
    ψ(t) = (e^{bt} − e^{at})/(t(b − a)).

Uniform, U(0, 1):
    f(x) = 1, 0 < x < 1;  E(X) = 1/2;  Var(X) = 1/12;  ψ(t) = (e^t − 1)/t.

Uniform, U(−a, a):
    f(x) = 1/(2a), −a < x < a;  E(X) = 0;  Var(X) = a²/3;  ψ(t) = sinh(at)/(at).

Triangular, Tri(−a, a):
    f(x) = (1/a)(1 − |x|/a), |x| < a;  E(X) = 0;  Var(X) = a²/6;  ψ(t) = *.

Exponential, Exp(a), a > 0:
    f(x) = (1/a) e^{−x/a}, x > 0;  E(X) = a;  Var(X) = a²;  ψ(t) = 1/(1 − at) for t < 1/a.

Gamma, Γ(p, a), p > 0, a > 0:
    f(x) = (1/Γ(p)) x^{p−1} (1/a^p) e^{−x/a}, x > 0;  E(X) = pa;  Var(X) = pa²;
    ψ(t) = (1/(1 − at))^p for t < 1/a.

Chi-square, χ²(n):
    f(x) = (1/(2^{n/2} Γ(n/2))) x^{n/2−1} e^{−x/2}, x > 0;  E(X) = n;  Var(X) = 2n;
    ψ(t) = (1 − 2t)^{−n/2} for t < 1/2.

Laplace, L(a), a > 0:
    f(x) = (1/(2a)) e^{−|x|/a};  E(X) = 0;  Var(X) = 2a²;  ψ(t) = 1/(1 − a²t²) for |t| < 1/a.

Beta, β(r, s), r > 0, s > 0:
    f(x) = (Γ(r + s)/(Γ(r)Γ(s))) x^{r−1} (1 − x)^{s−1}, 0 < x < 1;
    E(X) = r/(r + s);  Var(X) = rs/((r + s)²(r + s + 1));  ψ(t) = *.

Weibull, W(a, c), a > 0, c > 0:
    f(x) = (c/a)(x/a)^{c−1} e^{−(x/a)^c}, x > 0;
    E(X) = a Γ(1 + 1/c);  Var(X) = a²(Γ(1 + 2/c) − (Γ(1 + 1/c))²);  ψ(t) = *.
Rayleigh, Ra(a), a > 0:
    f(x) = (x/a²) e^{−x²/(2a²)}, x > 0;  E(X) = a √(π/2);  Var(X) = ((4 − π)/2) a²;  ψ(t) = *.

Normal, N(μ, σ²):
    f(x) = (1/(σ√(2π))) e^{−(x−μ)²/(2σ²)};  E(X) = μ;  Var(X) = σ²;  ψ(t) = e^{μt + σ²t²/2}.

Normal, N(0, 1):
    f(x) = (1/√(2π)) e^{−x²/2};  E(X) = 0;  Var(X) = 1;  ψ(t) = e^{t²/2}.

Log-normal, LN(μ, σ²):
    f(x) = (1/(xσ√(2π))) e^{−(ln x − μ)²/(2σ²)}, x > 0;
    E(X) = e^{μ + σ²/2};  Var(X) = e^{2μ + σ²}(e^{σ²} − 1);  ψ(t) does not exist for t > 0.

Logistic, L(μ, s), s > 0:
    f(x) = e^{−(x−μ)/s} / (s(1 + e^{−(x−μ)/s})²);  E(X) = μ;  Var(X) = π²s²/3;
    ψ(t) = e^{μt} Γ(1 − st) Γ(1 + st) for |t| < 1/s.

Student's t, t(n):
    f(x) = (Γ((n + 1)/2)/(√(nπ) Γ(n/2))) (1 + x²/n)^{−(n+1)/2};
    E(X) = 0 for n > 1;  Var(X) = n/(n − 2) for n > 2;  ψ(t) does not exist.

Fisher's F, F(m, n):
    f(x) = (Γ((m + n)/2)/(Γ(m/2)Γ(n/2))) (m/n)^{m/2} x^{m/2−1} (1 + mx/n)^{−(m+n)/2}, x > 0;
    E(X) = n/(n − 2) for n > 2;  Var(X) = 2n²(m + n − 2)/(m(n − 2)²(n − 4)) for n > 4;
    ψ(t) does not exist.

Cauchy, C(m, a), a > 0:
    f(x) = (1/π) · a/(a² + (x − m)²);  E(X), Var(X), and ψ(t) do not exist.

Cauchy, C(0, 1):
    f(x) = 1/(π(1 + x²));  E(X), Var(X), and ψ(t) do not exist.

Pareto, Pa(k, α), k > 0, α > 0:
    f(x) = α k^α / x^{α+1}, x > k;  E(X) = αk/(α − 1) for α > 1;
    Var(X) = αk²/((α − 1)²(α − 2)) for α > 2;  ψ(t) does not exist for t > 0.
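As with the discrete table, entries here can be checked by numerical integration. For example, integrating x · f(x) for the Γ(p, a) density with p = 3 and a = 2 (arbitrary example values) should reproduce E(X) = pa = 6:

```python
import math

# Gamma(p, a) density from the table: f(x) = x^{p-1} e^{-x/a} / (Gamma(p) a^p)
def gamma_pdf(x, p=3.0, a=2.0):
    return x ** (p - 1) * math.exp(-x / a) / (math.gamma(p) * a ** p)

# Trapezoidal rule on [0, 80]; the tail beyond 80 is negligible for p=3, a=2
N, hi = 200_000, 80.0
h = hi / N
xs = [i * h for i in range(N + 1)]
ys = [x * gamma_pdf(x) for x in xs]
mean = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))
print(round(mean, 4))  # E(X) = p*a = 6.0
```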