STAT/MTHE 353: Probability II — Multiple Random Variables
Instructor: Tamas Linder
Administrative details

Instructor: T. Linder, Queen's University, Winter 2012
Office: Jeffery 401
Office hours: Wednesday 2:30-3:30 pm
Class web site: stat353. All homework and solutions will be posted here. Check frequently for new announcements.

Text: Fundamentals of Probability with Stochastic Processes, 3rd ed., by S. Ghahramani, Prentice Hall. Lecture slides will be posted on the class web site. The slides are not self-contained; they only cover parts of the material.

Homework: 9 HW assignments. Homework is due Friday before noon in my mailbox (Jeffery 401). No late homework will be accepted!

Evaluation: the better of
- Homework 20%, midterm 20%, final exam 60%
- Homework 20%, final exam 80%

Midterm exam: Thursday, February 16, in class (9:30-10:30 am).

Review

$S$ is the sample space; $P$ is a probability measure on $S$: $P$ is a function from a collection of subsets of $S$ (called the events) to $[0,1]$, and $P$ satisfies the axioms of probability.

A random variable is a function $X : S \to \mathbb{R}$. The distribution of $X$ is the probability measure associated with $X$:

    $P(X \in A) = P(\{s : X(s) \in A\})$, for any reasonable $A \subset \mathbb{R}$.
Here are the usual ways to describe the distribution of $X$:

- Distribution function (cdf): always well defined. $F : \mathbb{R} \to [0,1]$ defined by $F(x) = P(X \le x)$.
- Probability mass function, or pmf: if $X$ is a discrete random variable, then its pmf $p : \mathbb{R} \to [0,1]$ is $p(x) = P(X = x)$, for all $x \in \mathbb{R}$. Note: since $X$ is discrete, there is a countable set $\mathcal{X} \subset \mathbb{R}$ such that $p(x) = 0$ if $x \notin \mathcal{X}$.
- Probability density function, or pdf: if $X$ is a continuous random variable, then its pdf $f : \mathbb{R} \to [0,\infty)$ is a function such that $P(X \in A) = \int_A f(x)\,dx$ for all reasonable $A \subset \mathbb{R}$.

Joint Distributions

If $X_1,\dots,X_n$ are random variables (defined on the same probability space), we can think of $X = (X_1,\dots,X_n)^T$ as a random vector. (In this course $(X_1,\dots,X_n)$ is a row vector and its transpose, $(X_1,\dots,X_n)^T$, is a column vector.) Thus $X$ is a function $X : S \to \mathbb{R}^n$.

Distribution of $X$: for reasonable $A \subset \mathbb{R}^n$, we define

    $P(X \in A) = P(\{s : X(s) \in A\})$.

$X$ is called a random vector or vector random variable.

We usually describe the distribution of $X$ by a function on $\mathbb{R}^n$:

Joint cumulative distribution function (jcdf): the function defined for $(x_1,\dots,x_n) \in \mathbb{R}^n$ by

    $F(x) = F_{X_1,\dots,X_n}(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2, \dots, X_n \le x_n)$
    $\quad = P(\{X_1 \le x_1\} \cap \{X_2 \le x_2\} \cap \dots \cap \{X_n \le x_n\}) = P\Big(X \in \prod_{i=1}^n (-\infty, x_i]\Big)$

If $X_1,\dots,X_n$ are all discrete random variables, then their joint probability mass function (jpmf) is

    $p(x) = P(X = x) = P(X_1 = x_1, X_2 = x_2, \dots, X_n = x_n)$, $x \in \mathbb{R}^n$.

The finite or countable set of values $x$ such that $p(x) > 0$ is called the support of the distribution of $X$.

Properties of the joint pmf:
(1) $0 \le p(x) \le 1$ for all $x \in \mathbb{R}^n$.
(2) $\sum_{x \in \mathcal{X}} p(x) = 1$, where $\mathcal{X}$ is the support of $X$.

If $X_1,\dots,X_n$ are continuous random variables and there exists $f : \mathbb{R}^n \to [0,\infty)$ such that for any reasonable $A \subset \mathbb{R}^n$,

    $P(X \in A) = \int_A f(x_1,\dots,x_n)\, dx_1 \cdots dx_n$

then $X_1,\dots,X_n$ are called jointly continuous, and $f(x) = f_{X_1,\dots,X_n}(x_1,\dots,x_n)$ is called the joint probability density function (jpdf) of $X$.
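As a quick numerical illustration (not part of the original slides), the two joint pmf properties above can be checked for a small hypothetical pmf on pairs $(x_1, x_2)$:

```python
# A small hypothetical joint pmf p(x1, x2), stored as a dict, used to
# check the two pmf properties numerically with exact arithmetic.
from fractions import Fraction as Fr

p = {
    (0, 0): Fr(1, 4), (0, 1): Fr(1, 8),
    (1, 0): Fr(1, 8), (1, 1): Fr(1, 2),
}

# Property (1): every value of p lies in [0, 1].
in_range = all(0 <= v <= 1 for v in p.values())

# Property (2): summing p over its support gives exactly 1.
total = sum(p.values())
```

Using `Fraction` keeps the check exact; with floats, property (2) would only hold up to rounding error.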
Properties of the joint pdf:
(1) $f(x) \ge 0$ for all $x \in \mathbb{R}^n$.
(2) $\int_{\mathbb{R}^n} f(x)\, dx = \int_{\mathbb{R}^n} f(x_1,\dots,x_n)\, dx_1 \cdots dx_n = 1$.

Comments:
(a) The joint pdf can be redefined on any set in $\mathbb{R}^n$ that has zero volume. This will not change the distribution of $X$.
(b) The joint pdf may not exist even when $X_1,\dots,X_n$ are all (individually) continuous random variables. Example: ...

The distributions of the various subsets of $\{X_1,\dots,X_n\}$ can be recovered from the joint distribution. These distributions are called the joint marginal distributions (here "marginal" is relative to the full set $\{X_1,\dots,X_n\}$).

Marginal joint probability mass functions

Assume $X_1,\dots,X_n$ are discrete. Let $0 < k < n$ and $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$. Then the marginal joint pmf of $(X_{i_1},\dots,X_{i_k})$ can be obtained from $p(x) = p_{X_1,\dots,X_n}(x_1,\dots,x_n)$ as

    $p_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = P(X_{i_1} = x_{i_1},\dots,X_{i_k} = x_{i_k})$
    $\quad = P(X_{i_1} = x_{i_1},\dots,X_{i_k} = x_{i_k},\ X_{j_1} \in \mathbb{R},\dots,X_{j_{n-k}} \in \mathbb{R})$
    $\quad = \sum_{x_{j_1}} \cdots \sum_{x_{j_{n-k}}} p_{X_1,\dots,X_n}(x_1,\dots,x_n)$

where $\{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$.

Thus the joint pmf of $X_{i_1},\dots,X_{i_k}$ is obtained by summing $p_{X_1,\dots,X_n}$ over all possible values of the complementary variables $x_{j_1},\dots,x_{j_{n-k}}$.

Example: In an urn there are $n_i$ objects of type $i$ for $i = 1,\dots,r$. The total number of objects is $n_1 + \dots + n_r = N$. We randomly draw $n$ objects ($n \le N$) without replacement. Let $X_i$ = # of objects of type $i$ drawn. Find the joint pmf of $(X_1,\dots,X_r)$. Also find the marginal distribution of each $X_i$, $i = 1,\dots,r$.
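The urn example can be explored numerically. In the sketch below (not from the slides; the counts $n_i$ are hypothetical illustration values), the joint pmf of $(X_1,\dots,X_r)$ is taken to be the multivariate hypergeometric form $p(k_1,\dots,k_r) = \big(\prod_i \binom{n_i}{k_i}\big) / \binom{N}{n}$ for $k_1 + \dots + k_r = n$, and the marginal of $X_1$ is computed by summing out the complementary variables, exactly as in the formula above:

```python
# Urn example sketch: joint pmf of the draw counts (X1, X2, X3) and the
# marginal of X1 obtained by summing over the complementary variables.
from math import comb
from itertools import product

n_types = [3, 4, 5]          # n_i objects of each type (hypothetical values)
N = sum(n_types)             # total number of objects
n = 4                        # number of draws, without replacement

def joint_pmf(counts):
    # p(k1, k2, k3) = [prod_i C(n_i, k_i)] / C(N, n) on the support
    if sum(counts) != n or any(k > m for k, m in zip(counts, n_types)):
        return 0.0
    num = 1
    for k, m in zip(counts, n_types):
        num *= comb(m, k)
    return num / comb(N, n)

# The joint pmf sums to 1 over its support.
total = sum(joint_pmf(c) for c in product(range(n + 1), repeat=3))

# Marginal of X1: sum out x2 and x3.
def marginal_X1(k):
    return sum(joint_pmf((k, a, b))
               for a in range(n + 1) for b in range(n + 1))

# It should match the univariate hypergeometric pmf.
hyper = lambda k: comb(n_types[0], k) * comb(N - n_types[0], n - k) / comb(N, n)
```

The agreement of `marginal_X1` with `hyper` illustrates the slide's claim that each $X_i$ is (univariate) hypergeometric.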
Marginal joint probability density functions

Let $X_1,\dots,X_n$ be jointly continuous with pdf $f = f_X$. As before, let

    $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$, $\quad \{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$

Let $B \subset \mathbb{R}^k$. Then

    $P\big((X_{i_1},\dots,X_{i_k}) \in B\big) = P\big((X_{i_1},\dots,X_{i_k}) \in B,\ X_{j_1} \in \mathbb{R},\dots,X_{j_{n-k}} \in \mathbb{R}\big)$
    $\quad = \int_B \Big( \int_{\mathbb{R}^{n-k}} f(x_1,\dots,x_n)\, dx_{j_1} \cdots dx_{j_{n-k}} \Big)\, dx_{i_1} \cdots dx_{i_k}$

In conclusion, for $\{i_1,\dots,i_k\} \subset \{1,\dots,n\}$,

    $f_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = \int_{\mathbb{R}^{n-k}} f(x_1,\dots,x_n)\, dx_{j_1} \cdots dx_{j_{n-k}}$

where $\{j_1,\dots,j_{n-k}\} = \{1,\dots,n\} \setminus \{i_1,\dots,i_k\}$. That is, we integrate out the variables complementary to $x_{i_1},\dots,x_{i_k}$.

Note: In both the discrete and continuous cases it is important to always know where the joint pmf $p$ and joint pdf $f$ are zero and where they are positive. The latter set is called the support of $p$ or $f$.

Example: Suppose $X_1, X_2, X_3$ are jointly continuous with jpdf

    $f(x_1, x_2, x_3) = \begin{cases} 1 & \text{if } 0 \le x_i \le 1,\ i = 1, 2, 3 \\ 0 & \text{otherwise} \end{cases}$

Find the marginal pdfs of $X_i$, $i = 1, 2, 3$, and the marginal jpdfs of $(X_i, X_j)$, $i \ne j$.

Example: With $X_1, X_2, X_3$ as in the previous problem, consider the quadratic equation $X_1 y^2 + X_2 y + X_3 = 0$ in the variable $y$. Find the probability that both roots are real.

Marginal joint cumulative distribution functions

In all cases (discrete, continuous, or mixed),

    $F_{X_{i_1},\dots,X_{i_k}}(x_{i_1},\dots,x_{i_k}) = P(X_{i_1} \le x_{i_1},\dots,X_{i_k} \le x_{i_k})$
    $\quad = P(X_{i_1} \le x_{i_1},\dots,X_{i_k} \le x_{i_k},\ X_{j_1} < \infty,\dots,X_{j_{n-k}} < \infty)$
    $\quad = \lim_{x_{j_1} \to \infty} \cdots \lim_{x_{j_{n-k}} \to \infty} F_{X_1,\dots,X_n}(x_1,\dots,x_n)$

That is, we let the variables complementary to $x_{i_1},\dots,x_{i_k}$ converge to $\infty$.
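A Monte Carlo sketch (not from the slides) for the quadratic example above: with the coefficients $X_1, X_2, X_3$ i.i.d. Uniform(0,1), both roots are real exactly when the discriminant $X_2^2 - 4 X_1 X_3$ is nonnegative, so the probability can be estimated by sampling:

```python
# Estimate P(both roots real) = P(X2^2 >= 4 X1 X3) for X1, X2, X3
# i.i.d. Uniform(0,1) by plain Monte Carlo sampling.
import random

random.seed(0)
N = 200_000
hits = 0
for _ in range(N):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x2 * x2 >= 4.0 * x1 * x3:
        hits += 1
est = hits / N
```

The exact value, obtained by integrating the joint pdf over the region $x_2^2 \ge 4 x_1 x_3$, is $5/36 + (\ln 2)/6 \approx 0.2544$, and the estimate should land close to it.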
Independence

Definition: The random variables $X_1,\dots,X_n$ are independent if for all reasonable $A_1,\dots,A_n \subset \mathbb{R}$,

    $P(X_1 \in A_1,\dots,X_n \in A_n) = P(X_1 \in A_1) \cdots P(X_n \in A_n)$

Remarks:

(i) Independence among $X_1,\dots,X_n$ usually arises in a probability model by assumption. Such an assumption is reasonable if the outcome of $X_i$ has no effect on the outcomes of the other $X_j$'s.

(ii) The definition also applies to any $n$ random quantities $X_1,\dots,X_n$. E.g., each $X_i$ can itself be a vector r.v. In this case the $A_i$'s have to be appropriately modified.

(iii) Suppose $g_i : \mathbb{R} \to \mathbb{R}$, $i = 1,\dots,n$, are reasonable functions. If $X_1,\dots,X_n$ are independent, then so are $g_1(X_1),\dots,g_n(X_n)$.

Proof: For $A_1,\dots,A_n \subset \mathbb{R}$,

    $P\big(g_1(X_1) \in A_1,\dots,g_n(X_n) \in A_n\big) = P\big(X_1 \in g_1^{-1}(A_1),\dots,X_n \in g_n^{-1}(A_n)\big)$, where $g_i^{-1}(A_i) = \{x_i : g_i(x_i) \in A_i\}$,
    $\quad = P\big(X_1 \in g_1^{-1}(A_1)\big) \cdots P\big(X_n \in g_n^{-1}(A_n)\big)$
    $\quad = P\big(g_1(X_1) \in A_1\big) \cdots P\big(g_n(X_n) \in A_n\big)$

Since the sets $A_i$ were arbitrary, we obtain that $g_1(X_1),\dots,g_n(X_n)$ are independent.

Note: If we only know that $X_i$ and $X_j$ are independent for all $i \ne j$, it does not follow that $X_1,\dots,X_n$ are independent.

Independence and cdf, pmf, pdf

Theorem 1: Let $F$ be the joint cdf of the random variables $X_1,\dots,X_n$. Then $X_1,\dots,X_n$ are independent if and only if $F$ is the product of the marginal cdfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,

    $F(x_1,\dots,x_n) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_n}(x_n)$

Theorem 2: Let $X_1,\dots,X_n$ be discrete r.v.'s with joint pmf $p$.
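The closing Note, that pairwise independence does not imply joint independence, can be made concrete with Bernstein's classical counterexample (not from the slides): toss two fair coins, let $X_1, X_2$ be the outcomes and $X_3 = X_1 \oplus X_2$ their XOR. The sketch below verifies that every pair factorizes while the triple does not:

```python
# Bernstein's example: X1, X2 fair coin flips, X3 = X1 XOR X2.
# Pairwise independent, but not jointly independent.
from itertools import product
from fractions import Fraction as Fr

outcomes = list(product((0, 1), repeat=2))   # four equally likely outcomes

def prob(event):
    # exact probability of an event on the sample space
    return Fr(sum(1 for w in outcomes if event(w)), len(outcomes))

X = [lambda w: w[0], lambda w: w[1], lambda w: w[0] ^ w[1]]

# Every pair (Xi, Xj) factorizes: P(Xi=a, Xj=b) = P(Xi=a) P(Xj=b).
pairwise = all(
    prob(lambda w: X[i](w) == a and X[j](w) == b)
    == prob(lambda w: X[i](w) == a) * prob(lambda w: X[j](w) == b)
    for i in range(3) for j in range(i + 1, 3)
    for a in (0, 1) for b in (0, 1)
)

# But the triple fails: P(X1=0, X2=0, X3=0) = 1/4, while the product
# of the three marginal probabilities is 1/8.
joint = prob(lambda w: X[0](w) == 0 and X[1](w) == 0 and X[2](w) == 0)
```

Knowing any two of the three variables determines the third, which is exactly where joint independence breaks down.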
Then $X_1,\dots,X_n$ are independent if and only if $p$ is the product of the marginal pmfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,

    $p(x_1,\dots,x_n) = p_{X_1}(x_1) p_{X_2}(x_2) \cdots p_{X_n}(x_n)$

Proof of Theorem 1 (forward direction): If $X_1,\dots,X_n$ are independent, then

    $F(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2,\dots,X_n \le x_n)$
    $\quad = P(X_1 \le x_1) P(X_2 \le x_2) \cdots P(X_n \le x_n) = F_{X_1}(x_1) F_{X_2}(x_2) \cdots F_{X_n}(x_n)$

The converse, that $F(x_1,\dots,x_n) = \prod_{i=1}^n F_{X_i}(x_i)$ for all $(x_1,\dots,x_n) \in \mathbb{R}^n$ implies independence, is outside the scope of this class.

Proof of Theorem 2 (forward direction): If $X_1,\dots,X_n$ are independent, then

    $p(x_1,\dots,x_n) = P(X_1 = x_1, X_2 = x_2,\dots,X_n = x_n)$
    $\quad = P(X_1 = x_1) P(X_2 = x_2) \cdots P(X_n = x_n) = p_{X_1}(x_1) p_{X_2}(x_2) \cdots p_{X_n}(x_n)$
Proof of Theorem 2 (cont'd): Conversely, suppose that $p(x_1,\dots,x_n) = \prod_{i=1}^n p_{X_i}(x_i)$ for all $x_1,\dots,x_n$. Then, for any $A_1, A_2,\dots,A_n \subset \mathbb{R}$,

    $P(X_1 \in A_1,\dots,X_n \in A_n) = \sum_{x_1 \in A_1} \cdots \sum_{x_n \in A_n} p(x_1,\dots,x_n)$
    $\quad = \sum_{x_1 \in A_1} \cdots \sum_{x_n \in A_n} p_{X_1}(x_1) \cdots p_{X_n}(x_n)$
    $\quad = \Big(\sum_{x_1 \in A_1} p_{X_1}(x_1)\Big) \Big(\sum_{x_2 \in A_2} p_{X_2}(x_2)\Big) \cdots \Big(\sum_{x_n \in A_n} p_{X_n}(x_n)\Big)$
    $\quad = P(X_1 \in A_1) P(X_2 \in A_2) \cdots P(X_n \in A_n)$

Thus $X_1,\dots,X_n$ are independent.

Theorem 3: Let $X_1,\dots,X_n$ be jointly continuous r.v.'s with joint pdf $f$. Then $X_1,\dots,X_n$ are independent if and only if $f$ is the product of the marginal pdfs of the $X_i$, i.e., for all $(x_1,\dots,x_n) \in \mathbb{R}^n$,

    $f(x_1,\dots,x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n)$.

Proof: Assume $f(x_1,\dots,x_n) = \prod_{i=1}^n f_{X_i}(x_i)$ for all $x_1,\dots,x_n$. Then for any $A_1, A_2,\dots,A_n \subset \mathbb{R}$,

    $P(X_1 \in A_1,\dots,X_n \in A_n) = \int_{A_1} \cdots \int_{A_n} f(x_1,\dots,x_n)\, dx_1 \cdots dx_n$
    $\quad = \int_{A_1} \cdots \int_{A_n} f_{X_1}(x_1) \cdots f_{X_n}(x_n)\, dx_1 \cdots dx_n$
    $\quad = \Big(\int_{A_1} f_{X_1}(x_1)\, dx_1\Big) \cdots \Big(\int_{A_n} f_{X_n}(x_n)\, dx_n\Big)$
    $\quad = P(X_1 \in A_1) P(X_2 \in A_2) \cdots P(X_n \in A_n)$

so $X_1,\dots,X_n$ are independent.

Proof (cont'd): For the converse, note that

    $F(x_1,\dots,x_n) = P(X_1 \le x_1, X_2 \le x_2,\dots,X_n \le x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f(t_1,\dots,t_n)\, dt_1 \cdots dt_n$

By the fundamental theorem of calculus (assuming $f$ is "nice enough"),

    $\frac{\partial^n}{\partial x_1 \cdots \partial x_n} F(x_1,\dots,x_n) = f(x_1,\dots,x_n)$

If $X_1,\dots,X_n$ are independent, then $F(x_1,\dots,x_n) = \prod_{i=1}^n F_{X_i}(x_i)$, so

    $f(x_1,\dots,x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F(x_1,\dots,x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_{X_1}(x_1) \cdots F_{X_n}(x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$

Example: ...
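The mixed-partial step in the converse can be illustrated numerically (a sketch, not from the slides). For two independent Exponential(1) variables, a hypothetical choice where everything is explicit, the joint cdf is $F(x,y) = (1 - e^{-x})(1 - e^{-y})$, and a finite-difference approximation of $\partial^2 F / \partial x\, \partial y$ recovers the joint pdf $e^{-x} e^{-y}$:

```python
# Check numerically that the mixed partial of the joint cdf of two
# independent Exponential(1) variables equals the joint pdf.
import math

def F(x, y):
    # joint cdf: product of the two marginal Exponential(1) cdfs
    return (1 - math.exp(-x)) * (1 - math.exp(-y))

def mixed_partial(F, x, y, h=1e-4):
    # central finite-difference approximation of d^2 F / dx dy
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

x, y = 0.7, 1.3
approx = mixed_partial(F, x, y)
exact = math.exp(-x) * math.exp(-y)   # f(x, y) = f_X(x) f_Y(y)
```

The step size $h$ trades truncation error against rounding error; $10^{-4}$ keeps both comfortably small here.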
Expectations Involving Multiple Random Variables

Recall that the expectation of a random variable $X$ is

    $E(X) = \begin{cases} \sum_x x\, p(x) & \text{if } X \text{ is discrete} \\ \int_{-\infty}^{\infty} x f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$

if the sum or the integral exists in the sense that $\sum_x |x|\, p(x) < \infty$ or $\int_{-\infty}^{\infty} |x| f(x)\, dx < \infty$.

Example: ...

If $X = (X_1,\dots,X_n)^T$ is a random vector, we sometimes use the notation

    $E(X) = \big(E(X_1),\dots,E(X_n)\big)^T$

For $X_1,\dots,X_n$ discrete, we still have $E(X) = \sum_x x\, p(x)$ with the understanding that

    $\sum_x x\, p(x) = \sum_{(x_1,\dots,x_n)} (x_1,\dots,x_n)^T p(x_1,\dots,x_n)$
    $\quad = \Big( \sum_{x_1} x_1\, p_{X_1}(x_1),\ \sum_{x_2} x_2\, p_{X_2}(x_2),\ \dots,\ \sum_{x_n} x_n\, p_{X_n}(x_n) \Big)^T = \big(E(X_1),\dots,E(X_n)\big)^T$

Similarly, for jointly continuous $X_1,\dots,X_n$,

    $E(X) = \int_{\mathbb{R}^n} x f(x)\, dx = \int_{\mathbb{R}^n} (x_1,\dots,x_n)^T f(x_1,\dots,x_n)\, dx_1 \cdots dx_n$

Theorem 4 ("Law of the unconscious statistician"): Suppose $Y = g(X)$ for some function $g : \mathbb{R}^n \to \mathbb{R}^k$. Then

    $E(Y) = \begin{cases} \sum_x g(x)\, p(x) & \text{if } X_1,\dots,X_n \text{ are discrete} \\ \int_{\mathbb{R}^n} g(x) f(x)\, dx & \text{if } X_1,\dots,X_n \text{ are jointly continuous} \end{cases}$

Proof: We only prove the discrete case. Since $X = (X_1,\dots,X_n)$ can only take a countable number of values with positive probability, the same is true for $(Y_1,\dots,Y_k)^T = Y = g(X)$, so $Y_1,\dots,Y_k$ are discrete random variables. Thus

    $E(Y) = \sum_y y\, P(Y = y) = \sum_y y\, P(g(X) = y)$
    $\quad = \sum_y y \sum_{x : g(x) = y} P(X = x) = \sum_y \sum_{x : g(x) = y} g(x)\, P(X = x) = \sum_x g(x)\, p(x)$

Example: Linearity of expectation...

    $E(a_0 + a_1 X_1 + \dots + a_n X_n) = a_0 + a_1 E(X_1) + \dots + a_n E(X_n)$
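The two routes in the discrete proof of Theorem 4 can be run side by side (a sketch, not from the slides; the pmf is a hypothetical example): compute $E(g(X))$ directly as $\sum_x g(x)\, p(x)$, and also by first deriving the pmf of $Y = g(X)$ and then summing $\sum_y y\, P(Y = y)$:

```python
# Check the law of the unconscious statistician in the discrete case
# for a small hypothetical pmf and g(x) = x^2.
from fractions import Fraction as Fr
from collections import defaultdict

p = {-1: Fr(1, 4), 0: Fr(1, 4), 1: Fr(1, 2)}   # pmf of X (hypothetical)
g = lambda x: x * x                             # Y = g(X) = X^2

# Route 1: direct LOTUS sum over the values of X.
lotus = sum(g(x) * px for x, px in p.items())

# Route 2: first collect P(Y = y) = sum over {x : g(x) = y}, then sum.
pY = defaultdict(Fr)
for x, px in p.items():
    pY[g(x)] += px
direct = sum(y * py for y, py in pY.items())
```

Both routes give $E(X^2) = (-1)^2 \cdot \tfrac14 + 0^2 \cdot \tfrac14 + 1^2 \cdot \tfrac12 = \tfrac34$, mirroring the regrouping step $\sum_y \sum_{x: g(x)=y}$ in the proof.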
Transformation of Multiple Random Variables

Example: Expected value of a binomial random variable...

Example: Suppose we have $n$ bar magnets, each having negative polarity at one end and positive polarity at the other end. Line up the magnets end-to-end in such a way that the orientation of each magnet is random (the two choices are equally likely, independently of the others). On average, how many segments of magnets that stick together do we obtain?

Suppose $X_1,\dots,X_n$ are jointly continuous with joint pdf $f(x_1,\dots,x_n)$. Let $h : \mathbb{R}^n \to \mathbb{R}^n$ be a continuously differentiable and one-to-one (invertible) function whose inverse $g$ is also continuously differentiable. Thus $h$ is given by $(x_1,\dots,x_n) \mapsto (y_1,\dots,y_n)$, where

    $y_1 = h_1(x_1,\dots,x_n),\ y_2 = h_2(x_1,\dots,x_n),\ \dots,\ y_n = h_n(x_1,\dots,x_n)$

We want to find the joint pdf of the vector $Y = (Y_1,\dots,Y_n)^T$, where

    $Y_1 = h_1(X_1,\dots,X_n),\ Y_2 = h_2(X_1,\dots,X_n),\ \dots,\ Y_n = h_n(X_1,\dots,X_n)$

Let $g : \mathbb{R}^n \to \mathbb{R}^n$ denote the inverse of $h$. Let $B \subset \mathbb{R}^n$ be a nice set. We have

    $P\big((Y_1,\dots,Y_n) \in B\big) = P\big(h(X) \in B\big) = P\big((X_1,\dots,X_n) \in A\big)$

where $A = \{x \in \mathbb{R}^n : h(x) \in B\} = h^{-1}(B) = g(B)$. The multivariate change of variables formula then implies that

    $P\big((X_1,\dots,X_n) \in A\big) = \int_{g(B)} f(x_1,\dots,x_n)\, dx_1 \cdots dx_n$
    $\quad = \int_B f\big(g_1(y_1,\dots,y_n),\dots,g_n(y_1,\dots,y_n)\big)\, |J_g(y_1,\dots,y_n)|\, dy_1 \cdots dy_n = \int_B f(g(y))\, |J_g(y)|\, dy$

We have shown that for nice $B \subset \mathbb{R}^n$,

    $P\big((Y_1,\dots,Y_n) \in B\big) = \int_B f(g(y))\, |J_g(y)|\, dy$

This implies the following:

Theorem 5 (Transformation of Multiple Random Variables): Suppose $X_1,\dots,X_n$ are jointly continuous with joint pdf $f(x_1,\dots,x_n)$. Let $h : \mathbb{R}^n \to \mathbb{R}^n$ be a continuously differentiable and one-to-one function with continuously differentiable inverse $g$. Then the joint pdf of $Y = (Y_1,\dots,Y_n)^T = h(X)$ is

    $f_Y(y_1,\dots,y_n) = f\big(g_1(y_1,\dots,y_n),\dots,g_n(y_1,\dots,y_n)\big)\, |J_g(y_1,\dots,y_n)|$

where $J_g$ is the Jacobian of the transformation $g$.
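Theorem 5 can be sanity-checked numerically (a sketch, not from the slides, for a hypothetical linear transformation): take $X = (X_1, X_2)$ uniform on the unit square and $Y = h(X) = (X_1 + X_2,\ X_1 - X_2)$. The inverse is $g(y) = \big((y_1+y_2)/2,\ (y_1-y_2)/2\big)$ with Jacobian determinant $J_g = -1/2$, so the theorem gives $f_Y(y) = f(g(y))\, |J_g| = 1/2$ whenever $g(y)$ lies in the unit square. Any pdf must integrate to 1, and we can verify that by Monte Carlo:

```python
# Verify that the transformed pdf from Theorem 5 integrates to 1 for a
# linear map of a uniform distribution on the unit square.
import random

def f_Y(y1, y2):
    x1, x2 = (y1 + y2) / 2.0, (y1 - y2) / 2.0   # x = g(y)
    inside = (0.0 <= x1 <= 1.0) and (0.0 <= x2 <= 1.0)
    return 0.5 if inside else 0.0               # f(g(y)) |J_g| = 1 * 1/2

# Estimate the integral of f_Y over the bounding box [0,2] x [-1,1]
# (area 4) by plain Monte Carlo.
random.seed(1)
N = 200_000
acc = sum(f_Y(random.uniform(0.0, 2.0), random.uniform(-1.0, 1.0))
          for _ in range(N))
integral = 4.0 * acc / N
```

The image of the unit square under $h$ is a rotated square of area 2, and $1/2 \times 2 = 1$, consistent with the estimate.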
Example: Suppose $X = (X_1,\dots,X_n)^T$ has joint pdf $f$ and let $Y = AX$, where $A$ is an invertible $n \times n$ (real) matrix. Find $f_Y$.

Often we are interested in the pdf of just a single function of $X_1,\dots,X_n$, say $Y_1 = h_1(X_1,\dots,X_n)$.

(1) Define $Y_i = h_i(X_1,\dots,X_n)$, $i = 2,\dots,n$, in such a way that the mapping $h = (h_1,\dots,h_n)$ satisfies the conditions of the theorem ($h$ has an inverse $g$ which is continuously differentiable). A common choice is $Y_i = X_i$, $i = 2,\dots,n$. Then the theorem gives the joint pdf $f_Y(y_1,\dots,y_n)$ and we obtain $f_{Y_1}(y_1)$ by integrating out $y_2,\dots,y_n$:

    $f_{Y_1}(y_1) = \int_{\mathbb{R}^{n-1}} f_Y(y_1,\dots,y_n)\, dy_2 \cdots dy_n$

(2) Often it is easier to directly compute the cdf of $Y_1$:

    $F_{Y_1}(y) = P(Y_1 \le y) = P\big(h_1(X_1,\dots,X_n) \le y\big) = P\big((X_1,\dots,X_n) \in A_y\big) = \int_{A_y} f(x_1,\dots,x_n)\, dx_1 \cdots dx_n$

where $A_y = \{(x_1,\dots,x_n) : h_1(x_1,\dots,x_n) \le y\}$. Differentiating $F_{Y_1}$ we obtain the pdf of $Y_1$.

Example: Let $X_1,\dots,X_n$ be independent with common distribution Uniform(0,1). Determine the pdf of $Y = \min(X_1,\dots,X_n)$.
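For the closing example, the cdf route (2) gives $P(Y > y) = P(X_1 > y,\dots,X_n > y) = (1-y)^n$, so $F_Y(y) = 1 - (1-y)^n$ and $f_Y(y) = n(1-y)^{n-1}$ on $(0,1)$. The simulation sketch below (not from the slides) checks the cdf formula at one point:

```python
# Simulate Y = min(X1,...,Xn) for i.i.d. Uniform(0,1) variables and
# compare the empirical cdf at y0 with F_Y(y0) = 1 - (1 - y0)^n.
import random

random.seed(2)
n, N = 5, 100_000
y0 = 0.1

count = sum(1 for _ in range(N)
            if min(random.random() for _ in range(n)) <= y0)
empirical = count / N
exact = 1 - (1 - y0) ** n   # F_Y(y0)
```

The same complement trick, $P(\min > y) = \prod_i P(X_i > y)$, is how the cdf is derived analytically, so the check exercises the whole argument, not just the formula.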
4. Continuous Random Variables, the Pareto and Normal Distributions
4. Continuous Random Variables, the Pareto and Normal Distributions A continuous random variable X can take any value in a given range (e.g. height, weight, age). The distribution of a continuous random
Common sense, and the model that we have used, suggest that an increase in p means a decrease in demand, but this is not the only possibility.
Lecture 6: Income and Substitution E ects c 2009 Je rey A. Miron Outline 1. Introduction 2. The Substitution E ect 3. The Income E ect 4. The Sign of the Substitution E ect 5. The Total Change in Demand
Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. Part 3: Discrete Uniform Distribution Binomial Distribution
Chapter 3: DISCRETE RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Part 3: Discrete Uniform Distribution Binomial Distribution Sections 3-5, 3-6 Special discrete random variable distributions we will cover
1 if 1 x 0 1 if 0 x 1
Chapter 3 Continuity In this chapter we begin by defining the fundamental notion of continuity for real valued functions of a single real variable. When trying to decide whether a given function is or
Statistics 100A Homework 4 Solutions
Problem 1 For a discrete random variable X, Statistics 100A Homework 4 Solutions Ryan Rosario Note that all of the problems below as you to prove the statement. We are proving the properties of epectation
Representation of functions as power series
Representation of functions as power series Dr. Philippe B. Laval Kennesaw State University November 9, 008 Abstract This document is a summary of the theory and techniques used to represent functions
Chapter 3 RANDOM VARIATE GENERATION
Chapter 3 RANDOM VARIATE GENERATION In order to do a Monte Carlo simulation either by hand or by computer, techniques must be developed for generating values of random variables having known distributions.
STAT 830 Convergence in Distribution
STAT 830 Convergence in Distribution Richard Lockhart Simon Fraser University STAT 830 Fall 2011 Richard Lockhart (Simon Fraser University) STAT 830 Convergence in Distribution STAT 830 Fall 2011 1 / 31
Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model
Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model 1 September 004 A. Introduction and assumptions The classical normal linear regression model can be written
6.4 Normal Distribution
Contents 6.4 Normal Distribution....................... 381 6.4.1 Characteristics of the Normal Distribution....... 381 6.4.2 The Standardized Normal Distribution......... 385 6.4.3 Meaning of Areas under
Probability density function : An arbitrary continuous random variable X is similarly described by its probability density function f x = f X
Week 6 notes : Continuous random variables and their probability densities WEEK 6 page 1 uniform, normal, gamma, exponential,chi-squared distributions, normal approx'n to the binomial Uniform [,1] random
Multi-variable Calculus and Optimization
Multi-variable Calculus and Optimization Dudley Cooke Trinity College Dublin Dudley Cooke (Trinity College Dublin) Multi-variable Calculus and Optimization 1 / 51 EC2040 Topic 3 - Multi-variable Calculus
Similarity and Diagonalization. Similar Matrices
MATH022 Linear Algebra Brief lecture notes 48 Similarity and Diagonalization Similar Matrices Let A and B be n n matrices. We say that A is similar to B if there is an invertible n n matrix P such that
5. Continuous Random Variables
5. Continuous Random Variables Continuous random variables can take any value in an interval. They are used to model physical characteristics such as time, length, position, etc. Examples (i) Let X be
Inner Product Spaces
Math 571 Inner Product Spaces 1. Preliminaries An inner product space is a vector space V along with a function, called an inner product which associates each pair of vectors u, v with a scalar u, v, and
minimal polyonomial Example
Minimal Polynomials Definition Let α be an element in GF(p e ). We call the monic polynomial of smallest degree which has coefficients in GF(p) and α as a root, the minimal polyonomial of α. Example: We
For a partition B 1,..., B n, where B i B j = for i. A = (A B 1 ) (A B 2 ),..., (A B n ) and thus. P (A) = P (A B i ) = P (A B i )P (B i )
Probability Review 15.075 Cynthia Rudin A probability space, defined by Kolmogorov (1903-1987) consists of: A set of outcomes S, e.g., for the roll of a die, S = {1, 2, 3, 4, 5, 6}, 1 1 2 1 6 for the roll
Linear Equations in Linear Algebra
1 Linear Equations in Linear Algebra 1.5 SOLUTION SETS OF LINEAR SYSTEMS HOMOGENEOUS LINEAR SYSTEMS A system of linear equations is said to be homogeneous if it can be written in the form A 0, where A
Econometrics Simple Linear Regression
Econometrics Simple Linear Regression Burcu Eke UC3M Linear equations with one variable Recall what a linear equation is: y = b 0 + b 1 x is a linear equation with one variable, or equivalently, a straight
Unified Lecture # 4 Vectors
Fall 2005 Unified Lecture # 4 Vectors These notes were written by J. Peraire as a review of vectors for Dynamics 16.07. They have been adapted for Unified Engineering by R. Radovitzky. References [1] Feynmann,
PSTAT 120B Probability and Statistics
- Week University of California, Santa Barbara April 10, 013 Discussion section for 10B Information about TA: Fang-I CHU Office: South Hall 5431 T Office hour: TBA email: [email protected] Slides will
Markov random fields and Gibbs measures
Chapter Markov random fields and Gibbs measures 1. Conditional independence Suppose X i is a random element of (X i, B i ), for i = 1, 2, 3, with all X i defined on the same probability space (.F, P).
A Detailed Price Discrimination Example
A Detailed Price Discrimination Example Suppose that there are two different types of customers for a monopolist s product. Customers of type 1 have demand curves as follows. These demand curves include
Approximation of Aggregate Losses Using Simulation
Journal of Mathematics and Statistics 6 (3): 233-239, 2010 ISSN 1549-3644 2010 Science Publications Approimation of Aggregate Losses Using Simulation Mohamed Amraja Mohamed, Ahmad Mahir Razali and Noriszura
Core Maths C1. Revision Notes
Core Maths C Revision Notes November 0 Core Maths C Algebra... Indices... Rules of indices... Surds... 4 Simplifying surds... 4 Rationalising the denominator... 4 Quadratic functions... 4 Completing the
Walrasian Demand. u(x) where B(p, w) = {x R n + : p x w}.
Walrasian Demand Econ 2100 Fall 2015 Lecture 5, September 16 Outline 1 Walrasian Demand 2 Properties of Walrasian Demand 3 An Optimization Recipe 4 First and Second Order Conditions Definition Walrasian
Probability Generating Functions
page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence
