STAT/MTHE 353: Probability II. STAT/MTHE 353: Multiple Random Variables. Instructor: Tamas Linder.


Administrative details

Instructor: Tamas Linder, Queen's University, Winter 2012
Office: Jeffery 401
Phone:
Office hours: Wednesday 2:30-3:30 pm
Class web site: stat353
All homework and solutions will be posted here. Check frequently for new announcements.

Text: Fundamentals of Probability with Stochastic Processes, 3rd ed., by S. Ghahramani, Prentice Hall. Lecture slides will be posted on the class web site. The slides are not self-contained; they cover only part of the material.

Homework: 9 HW assignments. Homework is due Friday before noon in my mailbox (Jeffery 401). No late homework will be accepted!

Evaluation: the better of
- Homework 20%, midterm 20%, final exam 60%
- Homework 20%, final exam 80%

Midterm Exam: Thursday, February 16, in class (9:30-10:30 am)

Review

$S$ is the sample space; $P$ is a probability measure on $S$: $P$ is a function from a collection of subsets of $S$ (called the events) to $[0,1]$, and $P$ satisfies the axioms of probability. A random variable is a function $X : S \to \mathbb{R}$. The distribution of $X$ is the probability measure associated with $X$:
$$P(X \in A) = P(\{s : X(s) \in A\}), \quad \text{for any reasonable } A \subset \mathbb{R}.$$

STAT/MTHE 353: Multiple Random Variables 1/34
Here are the usual ways to describe the distribution of $X$:

Distribution function: it is always well defined. $F_X : \mathbb{R} \to [0,1]$ is defined by $F_X(x) = P(X \le x)$.

Probability mass function, or pmf: if $X$ is a discrete random variable, then its pmf $p_X : \mathbb{R} \to [0,1]$ is $p_X(x) = P(X = x)$ for all $x \in \mathbb{R}$. Note: since $X$ is discrete, there is a countable set $D \subset \mathbb{R}$ such that $p_X(x) = 0$ if $x \notin D$.

Probability density function, or pdf: if $X$ is a continuous random variable, then its pdf $f_X : \mathbb{R} \to [0,\infty)$ is a function such that
$$P(X \in A) = \int_A f_X(x)\,dx \quad \text{for all reasonable } A \subset \mathbb{R}.$$

Joint Distributions

If $X_1,\dots,X_n$ are random variables (defined on the same probability space), we can think of $X = (X_1,\dots,X_n)^T$ as a random vector. (In this course $(X_1,\dots,X_n)$ is a row vector and its transpose $(X_1,\dots,X_n)^T$ is a column vector.) Thus $X$ is a function $X : S \to \mathbb{R}^n$.

Distribution of $X$: for reasonable $A \subset \mathbb{R}^n$, we define $P(X \in A) = P(\{s : X(s) \in A\})$. $X$ is called a random vector or vector random variable.

We usually describe the distribution of $X$ by a function on $\mathbb{R}^n$:

Joint cumulative distribution function (joint cdf): the function defined for $(x_1,\dots,x_n) \in \mathbb{R}^n$ by
$$F_X(x) = F_{X_1,\dots,X_n}(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2, \dots, X_n \le x_n) = P(\{X_1 \le x_1\} \cap \{X_2 \le x_2\} \cap \cdots \cap \{X_n \le x_n\}) = P\Bigl(X \in \prod_{i=1}^n (-\infty, x_i]\Bigr).$$

If $X_1,\dots,X_n$ are all discrete random variables, then their joint probability mass function (joint pmf) is
$$p_X(x) = P(X = x) = P(X_1 = x_1, X_2 = x_2, \dots, X_n = x_n), \quad x \in \mathbb{R}^n.$$
The finite or countable set of values $x$ such that $p_X(x) > 0$ is called the support of the distribution of $X$.

Properties of the joint pmf:
(1) $0 \le p_X(x) \le 1$ for all $x \in \mathbb{R}^n$.
(2) $\sum_{x \in D} p_X(x) = 1$, where $D$ is the support of $X$.

If $X_1,\dots,X_n$ are continuous random variables and there exists $f : \mathbb{R}^n \to [0,\infty)$ such that for any reasonable $A \subset \mathbb{R}^n$,
$$P(X \in A) = \int_A f(x_1,\dots,x_n)\,dx_1 \cdots dx_n,$$
then $X_1,\dots,X_n$ are called jointly continuous, and $f_X = f_{X_1,\dots,X_n}$ is called the joint probability density function (joint pdf) of $X$.
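The joint pmf properties above are easy to check numerically. A minimal sketch, using an invented toy joint pmf (the table values are illustrative, not from the course):

```python
# Toy joint pmf of (X1, X2) on {0,1} x {0,1,2}; the values are illustrative.
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

# Property (2): the pmf sums to 1 over its support.
total = sum(p.values())

# P(X in A) for a "reasonable" set A, here A = {x : x1 + x2 >= 2}.
prob_A = sum(v for (x1, x2), v in p.items() if x1 + x2 >= 2)
print(total, prob_A)
```

The same dictionary-of-tuples pattern works for any finite support, so it is a convenient way to sanity-check hand computations with joint pmfs.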
Properties of the joint pdf:
(1) $f_X(x) \ge 0$ for all $x \in \mathbb{R}^n$.
(2) $\int_{\mathbb{R}^n} f_X(x)\,dx = \int_{\mathbb{R}^n} f_X(x_1,\dots,x_n)\,dx_1\cdots dx_n = 1$.

Comments:
(a) The joint pdf can be redefined on any set in $\mathbb{R}^n$ that has zero volume. This will not change the distribution of $X$.
(b) The joint pdf may not exist even when $X_1,\dots,X_n$ are all (individually) continuous random variables. Example: ...

The distributions of the various subsets of $\{X_1,\dots,X_n\}$ can be recovered from the joint distribution. These distributions are called the joint marginal distributions (here "marginal" is relative to the full set $\{X_1,\dots,X_n\}$).

Marginal joint probability mass functions

Assume $X_1,\dots,X_n$ are discrete. Let $0 < k < n$ and $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$. Then the marginal joint pmf of $(X_{i_1},\dots,X_{i_k})$ can be obtained from $p_X(x) = p_{X_1,\dots,X_n}(x_1,\dots,x_n)$ as
$$p_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = P(X_{i_1} = x_{i_1},\dots,X_{i_k} = x_{i_k}) = P(X_{i_1} = x_{i_1},\dots,X_{i_k} = x_{i_k},\ X_{j_1} \in \mathbb{R},\dots,X_{j_{n-k}} \in \mathbb{R}) = \sum_{x_{j_1}} \cdots \sum_{x_{j_{n-k}}} p_{X_1,\dots,X_n}(x_1,\dots,x_n),$$
where $\{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$.

Thus the joint pmf of $X_{i_1},\dots,X_{i_k}$ is obtained by summing $p_{X_1,\dots,X_n}$ over all possible values of the complementary variables $x_{j_1},\dots,x_{j_{n-k}}$.

Example: In an urn there are $n_i$ objects of type $i$ for $i = 1,\dots,r$. The total number of objects is $n_1 + \cdots + n_r = N$. We randomly draw $n$ objects ($n \le N$) without replacement. Let $X_i$ = the number of objects of type $i$ drawn. Find the joint pmf of $(X_1,\dots,X_r)$. Also find the marginal distribution of each $X_i$, $i = 1,\dots,r$.
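For the urn example, the joint pmf is multivariate hypergeometric, and summing out the complementary variables should reproduce the (univariate) hypergeometric marginal. A numeric sketch with invented counts ($n_1 = 3$, $n_2 = 4$, $n_3 = 5$, draw $n = 4$; these numbers are illustrative only):

```python
from math import comb
from itertools import product

# Urn with r = 3 types: counts[i] objects of type i+1, N total, n drawn
# without replacement.  The joint pmf is multivariate hypergeometric:
#   p(x1,...,xr) = C(n1,x1)...C(nr,xr) / C(N,n)   when x1+...+xr = n.
counts = [3, 4, 5]
N, n = sum(counts), 4

def joint_pmf(x):
    if sum(x) != n or any(xi > ni for xi, ni in zip(x, counts)):
        return 0.0
    num = 1
    for xi, ni in zip(x, counts):
        num *= comb(ni, xi)
    return num / comb(N, n)

# Marginal pmf of X1: sum the joint pmf over all values of (x2, x3).
def marginal_X1(x1):
    return sum(joint_pmf((x1, x2, x3))
               for x2, x3 in product(range(n + 1), repeat=2))

# The marginal should match the hypergeometric pmf
#   C(n1, x1) C(N - n1, n - x1) / C(N, n).
for x1 in range(n + 1):
    hyper = comb(counts[0], x1) * comb(N - counts[0], n - x1) / comb(N, n)
    print(x1, marginal_X1(x1), hyper)
```

This directly mirrors the slide: the marginal of $X_1$ falls out by summing the joint pmf over the complementary variables, and it agrees term by term with the closed-form hypergeometric answer.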
Marginal joint probability density functions

Let $X_1,\dots,X_n$ be jointly continuous with pdf $f = f_X$. As before, let $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$ and $\{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$. Let $B \subset \mathbb{R}^k$. Then
$$P\bigl((X_{i_1},\dots,X_{i_k}) \in B\bigr) = P\bigl((X_{i_1},\dots,X_{i_k}) \in B,\ X_{j_1} \in \mathbb{R},\dots,X_{j_{n-k}} \in \mathbb{R}\bigr) = \int_B \Bigl( \int_{\mathbb{R}^{n-k}} f(x_1,\dots,x_n)\,dx_{j_1}\cdots dx_{j_{n-k}} \Bigr)\,dx_{i_1}\cdots dx_{i_k}.$$

In conclusion, for $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$,
$$f_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = \int_{\mathbb{R}^{n-k}} f(x_1,\dots,x_n)\,dx_{j_1}\cdots dx_{j_{n-k}},$$
where $\{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$. That is, we integrate out the variables complementary to $x_{i_1},\dots,x_{i_k}$.

Note: In both the discrete and continuous cases it is important to always know where the joint pmf $p$ or joint pdf $f$ is zero and where it is positive. The latter set is called the support of $p$ or $f$.

Example: Suppose $X_1, X_2, X_3$ are jointly continuous with joint pdf
$$f(x_1,x_2,x_3) = \begin{cases} 1 & \text{if } 0 \le x_i \le 1,\ i = 1,2,3 \\ 0 & \text{otherwise.} \end{cases}$$
Find the marginal pdfs of $X_i$, $i = 1,2,3$, and the marginal joint pdfs of $(X_i, X_j)$, $i \ne j$.

Example: With $X_1, X_2, X_3$ as in the previous problem, consider the quadratic equation $X_1 y^2 + X_2 y + X_3 = 0$ in the variable $y$. Find the probability that both roots are real.

Marginal joint cumulative distribution functions

In all cases (discrete, continuous, or mixed),
$$F_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = P(X_{i_1} \le x_{i_1},\dots,X_{i_k} \le x_{i_k}) = P(X_{i_1} \le x_{i_1},\dots,X_{i_k} \le x_{i_k},\ X_{j_1} < \infty,\dots,X_{j_{n-k}} < \infty) = \lim_{x_{j_1}\to\infty} \cdots \lim_{x_{j_{n-k}}\to\infty} F_{X_1,\dots,X_n}(x_1,\dots,x_n).$$
That is, we let the variables complementary to $x_{i_1},\dots,x_{i_k}$ converge to $\infty$.
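The quadratic-roots example can be sanity-checked by simulation before working it analytically: both roots are real exactly when the discriminant $X_2^2 - 4 X_1 X_3$ is nonnegative. A Monte Carlo sketch (the seed and sample size are arbitrary choices made for this check):

```python
import random

# Monte Carlo check for the quadratic-roots example: with X1, X2, X3 i.i.d.
# Uniform(0,1), both roots of X1 y^2 + X2 y + X3 = 0 are real iff the
# discriminant X2^2 - 4 X1 X3 >= 0.  The exact answer, worth deriving by
# hand as the example asks, is 5/36 + ln(2)/6, roughly 0.254.
random.seed(353)
trials = 200_000
hits = sum(
    1 for _ in range(trials)
    if random.random() ** 2 >= 4 * random.random() * random.random()
)
estimate = hits / trials
print(estimate)
```

The estimate should land near 0.254, which gives a target to aim for when integrating the joint pdf over the region $\{x_2^2 \ge 4 x_1 x_3\}$ by hand.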
Independence

Definition: The random variables $X_1,\dots,X_n$ are independent if for all reasonable $A_1,\dots,A_n \subset \mathbb{R}$,
$$P(X_1 \in A_1,\dots,X_n \in A_n) = P(X_1 \in A_1) \cdots P(X_n \in A_n).$$

Remarks:
(i) Independence among $X_1,\dots,X_n$ usually arises in a probability model by assumption. Such an assumption is reasonable if the outcome of $X_i$ has no effect on the outcomes of the other $X_j$'s.
(ii) The definition also applies to any $n$ random quantities $X_1,\dots,X_n$; e.g., each $X_i$ can itself be a vector random variable. In this case the $A_i$'s have to be appropriately modified.
(iii) Suppose $g_i : \mathbb{R} \to \mathbb{R}$, $i = 1,\dots,n$, are reasonable functions. Then if $X_1,\dots,X_n$ are independent, so are $g_1(X_1),\dots,g_n(X_n)$.

Proof: For $A_1,\dots,A_n \subset \mathbb{R}$,
$$P\bigl(g_1(X_1) \in A_1,\dots,g_n(X_n) \in A_n\bigr) = P\bigl(X_1 \in g_1^{-1}(A_1),\dots,X_n \in g_n^{-1}(A_n)\bigr),$$
where $g_i^{-1}(A_i) = \{x_i : g_i(x_i) \in A_i\}$. By independence this equals
$$P\bigl(X_1 \in g_1^{-1}(A_1)\bigr) \cdots P\bigl(X_n \in g_n^{-1}(A_n)\bigr) = P\bigl(g_1(X_1) \in A_1\bigr) \cdots P\bigl(g_n(X_n) \in A_n\bigr).$$
Since the sets $A_i$ were arbitrary, we obtain that $g_1(X_1),\dots,g_n(X_n)$ are independent.

Note: If we only know that $X_i$ and $X_j$ are independent for all $i \ne j$, it does not follow that $X_1,\dots,X_n$ are independent.

Independence and cdf, pmf, pdf

Theorem 1: Let $F$ be the joint cdf of the random variables $X_1,\dots,X_n$. Then $X_1,\dots,X_n$ are independent if and only if $F$ is the product of the marginal cdfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,
$$F(x_1,\dots,x_n) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_n}(x_n).$$

Theorem 2: Let $X_1,\dots,X_n$ be discrete r.v.'s with joint pmf $p$.
Then $X_1,\dots,X_n$ are independent if and only if $p$ is the product of the marginal pmfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,
$$p(x_1,\dots,x_n) = p_{X_1}(x_1) p_{X_2}(x_2) \cdots p_{X_n}(x_n).$$

Proof of Theorem 1: If $X_1,\dots,X_n$ are independent, then
$$F(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2, \dots, X_n \le x_n) = P(X_1 \le x_1) P(X_2 \le x_2) \cdots P(X_n \le x_n) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_n}(x_n).$$
The converse, that $F(x_1,\dots,x_n) = \prod_{i=1}^n F_{X_i}(x_i)$ for all $(x_1,\dots,x_n) \in \mathbb{R}^n$ implies independence, is outside the scope of this class.

Proof of Theorem 2: If $X_1,\dots,X_n$ are independent, then
$$p(x_1,\dots,x_n) = P(X_1 = x_1, X_2 = x_2, \dots, X_n = x_n) = P(X_1 = x_1) P(X_2 = x_2) \cdots P(X_n = x_n) = p_{X_1}(x_1) p_{X_2}(x_2) \cdots p_{X_n}(x_n).$$
Proof cont'd: Conversely, suppose that $p(x_1,\dots,x_n) = \prod_{i=1}^n p_{X_i}(x_i)$ for all $x_1,\dots,x_n$. Then, for any $A_1, A_2,\dots,A_n \subset \mathbb{R}$,
$$P(X_1 \in A_1,\dots,X_n \in A_n) = \sum_{x_1 \in A_1} \cdots \sum_{x_n \in A_n} p(x_1,\dots,x_n) = \sum_{x_1 \in A_1} \cdots \sum_{x_n \in A_n} p_{X_1}(x_1) \cdots p_{X_n}(x_n) = \Bigl(\sum_{x_1 \in A_1} p_{X_1}(x_1)\Bigr)\Bigl(\sum_{x_2 \in A_2} p_{X_2}(x_2)\Bigr) \cdots \Bigl(\sum_{x_n \in A_n} p_{X_n}(x_n)\Bigr) = P(X_1 \in A_1) P(X_2 \in A_2) \cdots P(X_n \in A_n).$$
Thus $X_1,\dots,X_n$ are independent.

Theorem 3: Let $X_1,\dots,X_n$ be jointly continuous r.v.'s with joint pdf $f$. Then $X_1,\dots,X_n$ are independent if and only if $f$ is the product of the marginal pdfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,
$$f(x_1,\dots,x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n).$$

Proof: Assume $f(x_1,\dots,x_n) = \prod_{i=1}^n f_{X_i}(x_i)$ for all $x_1,\dots,x_n$. Then for any $A_1, A_2,\dots,A_n \subset \mathbb{R}$,
$$P(X_1 \in A_1,\dots,X_n \in A_n) = \int_{A_1} \cdots \int_{A_n} f(x_1,\dots,x_n)\,dx_1\cdots dx_n = \int_{A_1} \cdots \int_{A_n} f_{X_1}(x_1) \cdots f_{X_n}(x_n)\,dx_1\cdots dx_n = \Bigl(\int_{A_1} f_{X_1}(x_1)\,dx_1\Bigr) \cdots \Bigl(\int_{A_n} f_{X_n}(x_n)\,dx_n\Bigr) = P(X_1 \in A_1) P(X_2 \in A_2) \cdots P(X_n \in A_n),$$
so $X_1,\dots,X_n$ are independent.

Proof cont'd: For the converse, note that
$$F(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2, \dots, X_n \le x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f(t_1,\dots,t_n)\,dt_1\cdots dt_n.$$
By the fundamental theorem of calculus (assuming $f$ is "nice enough"),
$$\frac{\partial^n}{\partial x_1 \cdots \partial x_n} F(x_1,\dots,x_n) = f(x_1,\dots,x_n).$$
If $X_1,\dots,X_n$ are independent, then $F(x_1,\dots,x_n) = \prod_{i=1}^n F_{X_i}(x_i)$, so
$$f(x_1,\dots,x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F(x_1,\dots,x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} \bigl(F_{X_1}(x_1) \cdots F_{X_n}(x_n)\bigr) = f_{X_1}(x_1) \cdots f_{X_n}(x_n).$$

Example: ...
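The factorization test of Theorem 2 is mechanical in the discrete case: compute the marginals, then check that the joint pmf equals their product everywhere. A sketch with two invented toy pmfs, one independent and one not:

```python
from itertools import product

# Theorem 2, numerically: discrete X1, X2 are independent iff the joint
# pmf factors as the product of the marginals at every point.
def marginals(p):
    pX1, pX2 = {}, {}
    for (x1, x2), v in p.items():
        pX1[x1] = pX1.get(x1, 0.0) + v
        pX2[x2] = pX2.get(x2, 0.0) + v
    return pX1, pX2

def is_independent(p, tol=1e-12):
    pX1, pX2 = marginals(p)
    return all(abs(p[(x1, x2)] - pX1[x1] * pX2[x2]) < tol
               for x1, x2 in product(pX1, pX2))

# Independent example: the joint pmf is built as a product of marginals.
p_indep = {(x1, x2): (0.3 if x1 == 0 else 0.7) * (0.4 if x2 == 0 else 0.6)
           for x1, x2 in product((0, 1), repeat=2)}

# Dependent example: X2 = X1 with probability 1, so the pmf cannot factor.
p_dep = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

print(is_independent(p_indep), is_independent(p_dep))
```

Note the check must hold at every point of the product of the marginal supports, which is why the dictionaries above carry explicit zeros.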
Expectations Involving Multiple Random Variables

Recall that the expectation of a random variable $X$ is
$$E(X) = \begin{cases} \sum_x x\,p_X(x) & \text{if } X \text{ is discrete} \\ \int_{-\infty}^{\infty} x f_X(x)\,dx & \text{if } X \text{ is continuous,} \end{cases}$$
if the sum or the integral exists in the sense that $\sum_x |x|\,p_X(x) < \infty$ or $\int_{-\infty}^{\infty} |x|\,f_X(x)\,dx < \infty$.

Example: ...

If $X = (X_1,\dots,X_n)^T$ is a random vector, we sometimes use the notation
$$E(X) = \bigl(E(X_1),\dots,E(X_n)\bigr)^T.$$
For $X_1,\dots,X_n$ discrete, we still have $E(X) = \sum_x x\,p_X(x)$, with the understanding that
$$\sum_x x\,p_X(x) = \sum_{(x_1,\dots,x_n)} (x_1,\dots,x_n)^T\,p(x_1,\dots,x_n) = \Bigl(\sum_{x_1} x_1\,p_{X_1}(x_1),\ \sum_{x_2} x_2\,p_{X_2}(x_2),\ \dots,\ \sum_{x_n} x_n\,p_{X_n}(x_n)\Bigr)^T = \bigl(E(X_1),\dots,E(X_n)\bigr)^T.$$
Similarly, for jointly continuous $X_1,\dots,X_n$,
$$E(X) = \int_{\mathbb{R}^n} x\,f_X(x)\,dx = \int_{\mathbb{R}^n} (x_1,\dots,x_n)^T f(x_1,\dots,x_n)\,dx_1\cdots dx_n.$$

Theorem 4 ("Law of the unconscious statistician"): Suppose $Y = g(X)$ for some function $g : \mathbb{R}^n \to \mathbb{R}^k$. Then
$$E(Y) = \begin{cases} \sum_x g(x)\,p_X(x) & \text{if } X_1,\dots,X_n \text{ are discrete} \\ \int_{\mathbb{R}^n} g(x)\,f_X(x)\,dx & \text{if } X_1,\dots,X_n \text{ are jointly continuous.} \end{cases}$$

Proof: We only prove the discrete case. Since $X = (X_1,\dots,X_n)$ can take only a countable number of values with positive probability, the same is true for $(Y_1,\dots,Y_k)^T = Y = g(X)$, so $Y_1,\dots,Y_k$ are discrete random variables.

Proof cont'd: Thus
$$E(Y) = \sum_y y\,P(Y = y) = \sum_y y\,P(g(X) = y) = \sum_y y \sum_{x : g(x) = y} P(X = x) = \sum_y \sum_{x : g(x) = y} g(x)\,P(X = x) = \sum_x g(x)\,p_X(x).$$

Example: Linearity of expectation:
$$E(a_0 + a_1 X_1 + \cdots + a_n X_n) = a_0 + a_1 E(X_1) + \cdots + a_n E(X_n).$$
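Theorem 4 is easy to verify on a small discrete example: compute $E[g(X)]$ once by the LOTUS sum $\sum_x g(x)p(x)$ and once by first deriving the pmf of $Y = g(X)$, as in the proof. A sketch with an invented joint pmf and an invented $g$:

```python
# Law of the unconscious statistician on a toy joint pmf: compute E[g(X)]
# both directly and via the distribution of Y = g(X).
p = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}
g = lambda x1, x2: (x1 + x2) ** 2

# Direct LOTUS sum: sum of g(x) p(x) over the support of X.
lotus = sum(g(*x) * v for x, v in p.items())

# Via Y's pmf: p_Y(y) = sum of p(x) over {x : g(x) = y}, then E(Y) = sum y p_Y(y).
pY = {}
for x, v in p.items():
    pY[g(*x)] = pY.get(g(*x), 0.0) + v
via_Y = sum(y * v for y, v in pY.items())

print(lotus, via_Y)
```

The inner regrouping of the proof (collecting all $x$ with $g(x) = y$) is exactly what the `pY` accumulation loop does, so the two totals must agree.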
Example: Expected value of a binomial random variable: ...

Example: Suppose we have $n$ bar magnets, each having negative polarity at one end and positive polarity at the other end. Line up the magnets end to end in such a way that the orientation of each magnet is random (the two choices are equally likely, independently of the others). On average, how many segments of magnets that stick together do we obtain?

Transformation of Multiple Random Variables

Suppose $X_1,\dots,X_n$ are jointly continuous with joint pdf $f(x_1,\dots,x_n)$. Let $h : \mathbb{R}^n \to \mathbb{R}^n$ be a continuously differentiable and one-to-one (invertible) function whose inverse $g$ is also continuously differentiable. Thus $h$ is given by $(x_1,\dots,x_n) \mapsto (y_1,\dots,y_n)$, where
$$y_1 = h_1(x_1,\dots,x_n),\quad y_2 = h_2(x_1,\dots,x_n),\quad \dots,\quad y_n = h_n(x_1,\dots,x_n).$$
We want to find the joint pdf of the vector $Y = (Y_1,\dots,Y_n)^T$, where
$$Y_1 = h_1(X_1,\dots,X_n),\quad Y_2 = h_2(X_1,\dots,X_n),\quad \dots,\quad Y_n = h_n(X_1,\dots,X_n).$$

Let $g : \mathbb{R}^n \to \mathbb{R}^n$ denote the inverse of $h$, and let $B \subset \mathbb{R}^n$ be a nice set. We have
$$P\bigl((Y_1,\dots,Y_n) \in B\bigr) = P\bigl(h(X) \in B\bigr) = P\bigl((X_1,\dots,X_n) \in A\bigr),$$
where $A = \{x \in \mathbb{R}^n : h(x) \in B\} = h^{-1}(B) = g(B)$. The multivariate change of variables formula for $g(y)$ implies that
$$P\bigl((X_1,\dots,X_n) \in A\bigr) = \int_{g(B)} f(x_1,\dots,x_n)\,dx_1\cdots dx_n = \int_B f\bigl(g_1(y_1,\dots,y_n),\dots,g_n(y_1,\dots,y_n)\bigr)\,\bigl|J_g(y_1,\dots,y_n)\bigr|\,dy_1\cdots dy_n = \int_B f(g(y))\,\bigl|J_g(y)\bigr|\,dy.$$

We have shown that for nice $B \subset \mathbb{R}^n$,
$$P\bigl((Y_1,\dots,Y_n) \in B\bigr) = \int_B f(g(y))\,\bigl|J_g(y)\bigr|\,dy.$$
This implies the following:

Theorem 5 (Transformation of Multiple Random Variables): Suppose $X_1,\dots,X_n$ are jointly continuous with joint pdf $f(x_1,\dots,x_n)$. Let $h : \mathbb{R}^n \to \mathbb{R}^n$ be a continuously differentiable and one-to-one function with continuously differentiable inverse $g$. Then the joint pdf of $Y = (Y_1,\dots,Y_n)^T = h(X)$ is
$$f_Y(y_1,\dots,y_n) = f\bigl(g_1(y_1,\dots,y_n),\dots,g_n(y_1,\dots,y_n)\bigr)\,\bigl|J_g(y_1,\dots,y_n)\bigr|,$$
where $J_g$ is the Jacobian (determinant) of the transformation $g$.
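Theorem 5 can be sanity-checked numerically on an invented concrete map: take $X = (X_1, X_2)$ uniform on the unit square and $h(x_1,x_2) = (x_1 + x_2,\ x_1 - x_2)$, so $g(y_1,y_2) = ((y_1+y_2)/2,\ (y_1-y_2)/2)$ with $|J_g| = 1/2$. The resulting $f_Y$ should integrate to 1:

```python
# Numerical sanity check of the transformation theorem for the invented map
# h(x1, x2) = (x1 + x2, x1 - x2), X uniform on the unit square.
# Inverse: g(y1, y2) = ((y1 + y2)/2, (y1 - y2)/2), with |J_g| = 1/2, so
#   f_Y(y1, y2) = f(g(y1, y2)) * 1/2.
def f(x1, x2):                      # joint pdf of X
    return 1.0 if 0 < x1 < 1 and 0 < x2 < 1 else 0.0

def f_Y(y1, y2):                    # Theorem 5 applied to h
    return f((y1 + y2) / 2, (y1 - y2) / 2) * 0.5

# f_Y must integrate to 1; approximate with a midpoint Riemann sum over the
# bounding box [0, 2] x [-1, 1] of the support of Y.
m = 400
dy = 2.0 / m
total = sum(f_Y((i + 0.5) * dy, -1 + (j + 0.5) * dy) * dy * dy
            for i in range(m) for j in range(m))
print(total)
```

The support of $Y$ is the unit square rotated and scaled (area 2), on which $f_Y = 1/2$, so the Riemann sum comes out to 1 up to discretization error at the boundary.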
Example: Suppose $X = (X_1,\dots,X_n)^T$ has joint pdf $f$ and let $Y = AX$, where $A$ is an invertible $n \times n$ (real) matrix. Find $f_Y$.

Often we are interested in the pdf of just a single function of $X_1,\dots,X_n$, say $Y_1 = h_1(X_1,\dots,X_n)$.

(1) Define $Y_i = h_i(X_1,\dots,X_n)$, $i = 2,\dots,n$, in such a way that the mapping $h = (h_1,\dots,h_n)$ satisfies the conditions of the theorem ($h$ has an inverse $g$ which is continuously differentiable). Then the theorem gives the joint pdf $f_Y(y_1,\dots,y_n)$, and we obtain $f_{Y_1}(y_1)$ by integrating out $y_2,\dots,y_n$:
$$f_{Y_1}(y_1) = \int_{\mathbb{R}^{n-1}} f_Y(y_1,\dots,y_n)\,dy_2 \dots dy_n.$$
A common choice is $Y_i = X_i$, $i = 2,\dots,n$.

(2) Often it is easier to directly compute the cdf of $Y_1$:
$$F_{Y_1}(y) = P(Y_1 \le y) = P\bigl(h_1(X_1,\dots,X_n) \le y\bigr) = P\bigl((X_1,\dots,X_n) \in A_y\bigr) = \int_{A_y} f(x_1,\dots,x_n)\,dx_1\cdots dx_n,$$
where $A_y = \{(x_1,\dots,x_n) : h_1(x_1,\dots,x_n) \le y\}$. Differentiating $F_{Y_1}$ we obtain the pdf of $Y_1$.

Example: Let $X_1,\dots,X_n$ be independent with common distribution Uniform(0,1). Determine the pdf of $Y = \min(X_1,\dots,X_n)$.
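For the last example, method (2) gives $P(Y > y) = P(X_1 > y)\cdots P(X_n > y) = (1-y)^n$, so $F_Y(y) = 1 - (1-y)^n$ and $f_Y(y) = n(1-y)^{n-1}$ on $(0,1)$. A simulation sketch comparing the empirical cdf with this formula (seed, $n$, and sample size are arbitrary choices for the check):

```python
import random

# Check: if X1,...,Xn are i.i.d. Uniform(0,1) and Y = min(X1,...,Xn), then
#   P(Y > y) = (1 - y)^n,  so  F_Y(y) = 1 - (1 - y)^n  on (0, 1).
random.seed(353)
n, trials = 5, 100_000
samples = [min(random.random() for _ in range(n)) for _ in range(trials)]

# Compare the empirical cdf with 1 - (1 - y)^n at a few points.
for y in (0.1, 0.2, 0.5):
    empirical = sum(s <= y for s in samples) / trials
    print(y, empirical, 1 - (1 - y) ** n)
```

The same complementary-cdf trick works for $\max(X_1,\dots,X_n)$, whose cdf is simply $y^n$ on $(0,1)$.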
NULL SPACE, COLUMN SPACE, ROW SPACE Null Space, Column Space, Row Space In applications of linear algebra, subspaces of R n typically arise in one of two situations: ) as the set of solutions of a linear
More informationMATH 304 Linear Algebra Lecture 8: Inverse matrix (continued). Elementary matrices. Transpose of a matrix.
MATH 304 Linear Algebra Lecture 8: Inverse matrix (continued). Elementary matrices. Transpose of a matrix. Inverse matrix Definition. Let A be an n n matrix. The inverse of A is an n n matrix, denoted
More informationExamples: Joint Densities and Joint Mass Functions Example 1: X and Y are jointly continuous with joint pdf
AMS 3 Joe Mitchell Eamples: Joint Densities and Joint Mass Functions Eample : X and Y are jointl continuous with joint pdf f(,) { c 2 + 3 if, 2, otherwise. (a). Find c. (b). Find P(X + Y ). (c). Find marginal
More informationSenior Secondary Australian Curriculum
Senior Secondary Australian Curriculum Mathematical Methods Glossary Unit 1 Functions and graphs Asymptote A line is an asymptote to a curve if the distance between the line and the curve approaches zero
More informationBasics Inversion and related concepts Random vectors Matrix calculus. Matrix algebra. Patrick Breheny. January 20
Matrix algebra January 20 Introduction Basics The mathematics of multiple regression revolves around ordering and keeping track of large arrays of numbers and solving systems of equations The mathematical
More informationEC 6310: Advanced Econometric Theory
EC 6310: Advanced Econometric Theory July 2008 Slides for Lecture on Bayesian Computation in the Nonlinear Regression Model Gary Koop, University of Strathclyde 1 Summary Readings: Chapter 5 of textbook.
More informationMath 461 Fall 2006 Test 2 Solutions
Math 461 Fall 2006 Test 2 Solutions Total points: 100. Do all questions. Explain all answers. No notes, books, or electronic devices. 1. [105+5 points] Assume X Exponential(λ). Justify the following two
More informationProbability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0
Probability & Statistics Primer Gregory J. Hakim University of Washington 2 January 2009 v2.0 This primer provides an overview of basic concepts and definitions in probability and statistics. We shall
More informationLecture 7: Continuous Random Variables
Lecture 7: Continuous Random Variables 21 September 2005 1 Our First Continuous Random Variable The back of the lecture hall is roughly 10 meters across. Suppose it were exactly 10 meters, and consider
More informationCURVE FITTING LEAST SQUARES APPROXIMATION
CURVE FITTING LEAST SQUARES APPROXIMATION Data analysis and curve fitting: Imagine that we are studying a physical system involving two quantities: x and y Also suppose that we expect a linear relationship
More informationα = u v. In other words, Orthogonal Projection
Orthogonal Projection Given any nonzero vector v, it is possible to decompose an arbitrary vector u into a component that points in the direction of v and one that points in a direction orthogonal to v
More informationFunctions: Piecewise, Even and Odd.
Functions: Piecewise, Even and Odd. MA161/MA1161: Semester 1 Calculus. Prof. Götz Pfeiffer School of Mathematics, Statistics and Applied Mathematics NUI Galway September 2122, 2015 Tutorials, Online Homework.
More informationA Note on Di erential Calculus in R n by James Hebda August 2010.
A Note on Di erential Calculus in n by James Hebda August 2010 I. Partial Derivatives o Functions Let : U! be a real valued unction deined in an open neighborhood U o the point a =(a 1,...,a n ) in the
More information4. Matrix inverses. left and right inverse. linear independence. nonsingular matrices. matrices with linearly independent columns
L. Vandenberghe EE133A (Spring 2016) 4. Matrix inverses left and right inverse linear independence nonsingular matrices matrices with linearly independent columns matrices with linearly independent rows
More informationPrinciple of Data Reduction
Chapter 6 Principle of Data Reduction 6.1 Introduction An experimenter uses the information in a sample X 1,..., X n to make inferences about an unknown parameter θ. If the sample size n is large, then
More informationProbability and Statistics
CHAPTER 2: RANDOM VARIABLES AND ASSOCIATED FUNCTIONS 2b  0 Probability and Statistics Kristel Van Steen, PhD 2 Montefiore Institute  Systems and Modeling GIGA  Bioinformatics ULg kristel.vansteen@ulg.ac.be
More informationTopic 4: Multivariate random variables. Multiple random variables
Topic 4: Multivariate random variables Joint, marginal, and conditional pmf Joint, marginal, and conditional pdf and cdf Independence Expectation, covariance, correlation Conditional expectation Two jointly
More informationLinear Dependence Tests
Linear Dependence Tests The book omits a few key tests for checking the linear dependence of vectors. These short notes discuss these tests, as well as the reasoning behind them. Our first test checks
More information2.1 Functions. 2.1 J.A.Beachy 1. from A Study Guide for Beginner s by J.A.Beachy, a supplement to Abstract Algebra by Beachy / Blair
2.1 J.A.Beachy 1 2.1 Functions from A Study Guide for Beginner s by J.A.Beachy, a supplement to Abstract Algebra by Beachy / Blair 21. The Vertical Line Test from calculus says that a curve in the xyplane
More informationDefinition A square matrix M is invertible (or nonsingular) if there exists a matrix M 1 such that
0. Inverse Matrix Definition A square matrix M is invertible (or nonsingular) if there exists a matrix M such that M M = I = M M. Inverse of a 2 2 Matrix Let M and N be the matrices: a b d b M =, N = c
More informationNOTES ON LINEAR TRANSFORMATIONS
NOTES ON LINEAR TRANSFORMATIONS Definition 1. Let V and W be vector spaces. A function T : V W is a linear transformation from V to W if the following two properties hold. i T v + v = T v + T v for all
More informationContinuous Distributions
MAT 2379 3X (Summer 2012) Continuous Distributions Up to now we have been working with discrete random variables whose R X is finite or countable. However we will have to allow for variables that can take
More informationIntroduction to Flocking {Stochastic Matrices}
Supelec EECI Graduate School in Control Introduction to Flocking {Stochastic Matrices} A. S. Morse Yale University Gif sur  Yvette May 21, 2012 CRAIG REYNOLDS  1987 BOIDS The Lion King CRAIG REYNOLDS
More informationYou flip a fair coin four times, what is the probability that you obtain three heads.
Handout 4: Binomial Distribution Reading Assignment: Chapter 5 In the previous handout, we looked at continuous random variables and calculating probabilities and percentiles for those type of variables.
More informationChapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution
Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 35, 36 Special discrete random variable distributions we will cover
More informationHOMEWORK 5 SOLUTIONS. n!f n (1) lim. ln x n! + xn x. 1 = G n 1 (x). (2) k + 1 n. (n 1)!
Math 7 Fall 205 HOMEWORK 5 SOLUTIONS Problem. 2008 B2 Let F 0 x = ln x. For n 0 and x > 0, let F n+ x = 0 F ntdt. Evaluate n!f n lim n ln n. By directly computing F n x for small n s, we obtain the following
More informationTI84/83 graphing calculator
TI8/83 graphing calculator Let us use the table feature to approimate limits and then solve the problems the old fashion way. Find the following limits: Eample 1.1 lim f () Turn on your TI8 graphing
More informationF Matrix Calculus F 1
F Matrix Calculus F 1 Appendix F: MATRIX CALCULUS TABLE OF CONTENTS Page F1 Introduction F 3 F2 The Derivatives of Vector Functions F 3 F21 Derivative of Vector with Respect to Vector F 3 F22 Derivative
More informationECEN 5682 Theory and Practice of Error Control Codes
ECEN 5682 Theory and Practice of Error Control Codes Convolutional Codes University of Colorado Spring 2007 Linear (n, k) block codes take k data symbols at a time and encode them into n code symbols.
More informationSets and Counting. Let A and B be two sets. It is easy to see (from the diagram above) that A B = A + B A B
Sets and Counting Let us remind that the integer part of a number is the greatest integer that is less or equal to. It is denoted by []. Eample [3.1] = 3, [.76] = but [ 3.1] = 4 and [.76] = 6 A B Let A
More informationminimal polyonomial Example
Minimal Polynomials Definition Let α be an element in GF(p e ). We call the monic polynomial of smallest degree which has coefficients in GF(p) and α as a root, the minimal polyonomial of α. Example: We
More informationRandom Variable: A function that assigns numerical values to all the outcomes in the sample space.
STAT 509 Section 3.2: Discrete Random Variables Random Variable: A function that assigns numerical values to all the outcomes in the sample space. Notation: Capital letters (like Y) denote a random variable.
More informationMicroeconomic Theory: Basic Math Concepts
Microeconomic Theory: Basic Math Concepts Matt Van Essen University of Alabama Van Essen (U of A) Basic Math Concepts 1 / 66 Basic Math Concepts In this lecture we will review some basic mathematical concepts
More information