Probability Theory. "A random variable is neither random nor variable." (Gian-Carlo Rota, M.I.T.) Florian Herzog, 2013.


2 Probability space

A probability space W is a unique triple W = {Ω, F, P}:
- Ω is its sample space,
- F its σ-algebra of events,
- P its probability measure.

Remarks: (1) The sample space Ω is the set of all possible samples or elementary events ω: Ω = {ω | ω ∈ Ω}. (2) The σ-algebra F is the set of all of the considered events A, i.e., subsets of Ω: F = {A | A ⊆ Ω, A ∈ F}. (3) The probability measure P assigns a probability P(A) to every event A ∈ F: P : F → [0, 1]. Stochastic Systems
3 Sample space

The sample space Ω is sometimes called the universe of all samples or possible outcomes ω.

Example 1. Sample spaces
- Toss of a coin (with head and tail): Ω = {H, T}.
- Two tosses of a coin: Ω = {HH, HT, TH, TT}.
- A cubic die: Ω = {ω₁, ω₂, ω₃, ω₄, ω₅, ω₆}.
- The positive integers: Ω = {1, 2, 3, ...}.
- The reals: Ω = {ω | ω ∈ R}.

Note that the ω's are a mathematical construct and have per se no real or scientific meaning. The ω's in the die example refer to the number of dots observed when the die is thrown.
4 Event

An event A is a subset of Ω. If the outcome ω of the experiment is in the subset A, then the event A is said to have occurred. The set of all subsets of the sample space is denoted by 2^Ω.

Example 2. Events
- Head in the coin toss: A = {H}.
- Odd number in the roll of a die: A = {ω₁, ω₃, ω₅}.
- An integer smaller than 5: A = {1, 2, 3, 4}, where Ω = {1, 2, 3, ...}.
- A real number between 0 and 1: A = [0, 1], where Ω = {ω | ω ∈ R}.

We denote the complementary event of A by A^c = Ω\A. When it is possible to determine whether an event A has occurred or not, we must also be able to determine whether A^c has occurred or not.
5 Probability Measure I

Definition 1. Probability measure
A probability measure P on the countable sample space Ω is a set function P : F → [0, 1] satisfying the following conditions:
- P(Ω) = 1.
- P(ωᵢ) = pᵢ.
- If A₁, A₂, A₃, ... ∈ F are mutually disjoint, then P(⋃_{i=1}^∞ Aᵢ) = ∑_{i=1}^∞ P(Aᵢ).
6 Probability

The story so far:
- Sample space: Ω = {ω₁, ..., ω_n}, finite!
- Events: F = 2^Ω: all subsets of Ω.
- Probability: P(ωᵢ) = pᵢ and P(A ⊆ Ω) = ∑_{ωᵢ ∈ A} pᵢ.

Probability axioms of Kolmogorov (1933) for elementary probability:
- P(Ω) = 1.
- If A ⊆ Ω then P(A) ≥ 0.
- If A₁, A₂, A₃, ... ⊆ Ω are mutually disjoint, then P(⋃_{i=1}^∞ Aᵢ) = ∑_{i=1}^∞ P(Aᵢ).
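For a finite sample space with F = 2^Ω, the probability of an event is just the sum of the elementary probabilities. A minimal Python sketch (the fair-die probabilities and the `prob` helper are illustrative assumptions, not part of the slides):

```python
from fractions import Fraction

# Fair die: elementary probabilities p_i for each outcome omega_i.
p = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(event):
    """P(A) = sum of the p_i with omega_i in A, for A a subset of Omega."""
    return sum(p[w] for w in event)

omega = frozenset(p)
odd = frozenset({1, 3, 5})
even = omega - odd

assert prob(omega) == 1                       # P(Omega) = 1
assert prob(odd) == Fraction(1, 2)
# (finite) additivity for the disjoint events "odd" and "even"
assert prob(odd | even) == prob(odd) + prob(even)
```

Using `Fraction` keeps the arithmetic exact, so the Kolmogorov axioms can be checked with equality rather than floating-point tolerance.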
7 Uncountable sample spaces

The most important uncountable sample spaces for engineering are R and, more generally, R^n. Consider the example Ω = [0, 1], where every ω is equally likely. Obviously, P(ω) = 0 for any single ω. Intuitively, P([0, a]) = a; the basic concept is length!

Question: Does every subset of [0, 1] have a determinable length? Answer: No! (e.g., Vitali sets, the Banach-Tarski paradox)
Question: Is this of importance in practice? Answer: No!
Question: Does it matter for the underlying theory? Answer: A lot!
8 Fundamental mathematical tools

Not every subset of [0, 1] has a determinable length, so we collect the ones with a determinable length in F. Such a mathematical construct, which has additional desirable properties, is called a σ-algebra.

Definition 2. σ-algebra
A collection F of subsets of Ω is called a σ-algebra on Ω if:
- Ω ∈ F and ∅ ∈ F (∅ denotes the empty set).
- If A ∈ F then Ω\A = A^c ∈ F: the complement of A is also in F.
- For all Aᵢ ∈ F: ⋃_{i=1}^∞ Aᵢ ∈ F.

The intuition behind it: collect all events of interest in the σ-algebra F and make sure that performing countably many elementary set operations (∪, ∩, ^c) on elements of F yields again an element of F (closedness). The pair (Ω, F) is called a measurable space.
9 Example of σ-algebras

Example 3. σ-algebras of two coin tosses
Ω = {HH, HT, TH, TT} = {ω₁, ω₂, ω₃, ω₄}
F_min = {∅, Ω} = {∅, {ω₁, ω₂, ω₃, ω₄}}.
F₁ = {∅, {ω₁, ω₂}, {ω₃, ω₄}, {ω₁, ω₂, ω₃, ω₄}}.
F_max = {∅, {ω₁}, {ω₂}, {ω₃}, {ω₄}, {ω₁, ω₂}, {ω₁, ω₃}, {ω₁, ω₄}, {ω₂, ω₃}, {ω₂, ω₄}, {ω₃, ω₄}, {ω₁, ω₂, ω₃}, {ω₁, ω₂, ω₄}, {ω₁, ω₃, ω₄}, {ω₂, ω₃, ω₄}, Ω}.
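On a finite Ω the σ-algebra axioms can be checked mechanically, since countable unions reduce to finite ones. A sketch in Python, using the two-coin-toss example (the `is_sigma_algebra` helper is a hypothetical name introduced here):

```python
# Check the sigma-algebra axioms for F_1 from the two-coin-toss example.
omega = frozenset({'HH', 'HT', 'TH', 'TT'})
f1 = {frozenset(), frozenset({'HH', 'HT'}), frozenset({'TH', 'TT'}), omega}

def is_sigma_algebra(fam, om):
    if om not in fam or frozenset() not in fam:
        return False
    # closed under complement
    if any(om - a not in fam for a in fam):
        return False
    # closed under (finite) union
    return all(a | b in fam for a in fam for b in fam)

assert is_sigma_algebra(f1, omega)
# {emptyset, {HH}, Omega} is not one: the complement of {HH} is missing
assert not is_sigma_algebra({frozenset(), frozenset({'HH'}), omega}, omega)
```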
10 Generated σ-algebras

The concept of generated σ-algebras is important in probability theory.

Definition 3. σ(C): σ-algebra generated by a class C of subsets
Let C be a class of subsets of Ω. The σ-algebra generated by C, denoted by σ(C), is the smallest σ-algebra F which includes all elements of C, i.e., C ⊆ F.

In practice, we identify the different events of an experiment that we can measure (denote them by A), then simply work with the σ-algebra generated by A and thereby avoid all the measure-theoretic technicalities.
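For a finite Ω, the generated σ-algebra can be computed by closing C under complement and union until nothing new appears. A sketch (function name is an illustrative assumption; this brute-force closure is only feasible on small finite spaces):

```python
# sigma(C): close a class C of subsets of a finite Omega under
# complement and union; on a finite space this yields the smallest
# sigma-algebra containing C.
def generated_sigma_algebra(c, omega):
    fam = {frozenset(), frozenset(omega)} | {frozenset(s) for s in c}
    while True:
        new = {frozenset(omega) - a for a in fam}
        new |= {a | b for a in fam for b in fam}
        if new <= fam:
            return fam
        fam |= new

omega = {1, 2, 3, 4, 5, 6}
f = generated_sigma_algebra([{1, 3, 5}], omega)
# sigma({odd}) = {emptyset, odd, even, Omega}
assert f == {frozenset(), frozenset({1, 3, 5}), frozenset({2, 4, 6}),
             frozenset(omega)}
```

Intersections need not be closed over explicitly: by De Morgan's law they follow from complements and unions.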
11 Borel σ-algebra

The Borel σ-algebra includes all subsets of R which are of interest in practical applications (scientific or engineering).

Definition 4. Borel σ-algebra B(R)
The Borel σ-algebra B(R) is the smallest σ-algebra containing all open intervals in R. The sets in B(R) are called Borel sets. The extension to the multidimensional case, B(R^n), is straightforward.

Examples of Borel sets:
(−∞, a), (b, ∞), (−∞, a) ∪ (b, ∞),
[a, b] = ((−∞, a) ∪ (b, ∞))^c,
(−∞, a] = ⋃_{n=1}^∞ [a − n, a] and [b, ∞) = ⋃_{n=1}^∞ [b, b + n],
(a, b] = (−∞, b] ∩ (a, ∞),
{a} = ⋂_{n=1}^∞ (a − 1/n, a + 1/n),
{a₁, ..., a_n} = ⋃_{k=1}^n {a_k}.
12 Measure

Definition 5. Measure
Let F be a σ-algebra on Ω, so that (Ω, F) is a measurable space. The map µ : F → [0, ∞] is called a measure on (Ω, F) if µ is countably additive. The measure µ is countably additive (or σ-additive) if µ(∅) = 0 and if for every sequence of disjoint sets (Fᵢ : i ∈ N) in F with F = ⋃_{i∈N} Fᵢ we have µ(F) = ∑_{i∈N} µ(Fᵢ).

If µ is countably additive, it is also additive, meaning that for every F, G ∈ F we have µ(F ∪ G) = µ(F) + µ(G) whenever F ∩ G = ∅.

The triple (Ω, F, µ) is called a measure space.
13 Lebesgue Measure

The measure of length on the straight line is known as the Lebesgue measure.

Definition 6. Lebesgue measure on B(R)
The Lebesgue measure on B(R), denoted by λ, is defined as the measure on (R, B(R)) which assigns to each interval its length as its measure.

Examples:
- Lebesgue measure of one point: λ({a}) = 0.
- Lebesgue measure of countably many points: λ(A) = ∑_{i=1}^∞ λ({aᵢ}) = 0.
- The Lebesgue measure of a set containing uncountably many points can be zero, positive and finite, or infinite.
14 Probability Measure

Definition 7. Probability measure
A probability measure P on the sample space Ω with σ-algebra F is a set function P : F → [0, 1] satisfying the following conditions:
- P(Ω) = 1.
- If A ∈ F then P(A) ≥ 0.
- If A₁, A₂, A₃, ... ∈ F are mutually disjoint, then P(⋃_{i=1}^∞ Aᵢ) = ∑_{i=1}^∞ P(Aᵢ).

The triple (Ω, F, P) is called a probability space.
15 F-measurable functions

Definition 8. F-measurable function
The function f : Ω → R defined on (Ω, F, P) is called F-measurable if
f⁻¹(B) = {ω ∈ Ω : f(ω) ∈ B} ∈ F for all B ∈ B(R),
i.e., the inverse image f⁻¹ maps all of the Borel sets B ⊆ R to F.

Sometimes it is easier to work with the following equivalent condition: for all y ∈ R, {ω ∈ Ω : f(ω) ≤ y} ∈ F.

This means that once we know the (random) value f(ω), we know which of the events in F have happened.
- F = {∅, Ω}: only constant functions are measurable.
- F = 2^Ω: all functions are measurable.
16 F-measurable functions

(Figure: the sample space Ω with events A₁, ..., A₄ in F = σ(A₁, ..., A_n); a random variable X : Ω → R maps outcomes ω to the real line, and the preimage X⁻¹ maps Borel sets B ∈ B, e.g. (−∞, a], back into F.)
17 F-measurable functions: Example

Roll of a die: Ω = {ω₁, ω₂, ω₃, ω₄, ω₅, ω₆}. We only know whether an even or an odd number has shown up:
F = {∅, {ω₁, ω₃, ω₅}, {ω₂, ω₄, ω₆}, Ω} = σ({ω₁, ω₃, ω₅}).

Consider the following random variable:
f(ω) = 1, if ω ∈ {ω₁, ω₂, ω₃}; −1, if ω ∈ {ω₄, ω₅, ω₆}.

Check measurability with the condition: for all y ∈ R, {ω ∈ Ω : f(ω) ≤ y} ∈ F.
{ω ∈ Ω : f(ω) ≤ 0} = {ω₄, ω₅, ω₆} ∉ F, so f is not F-measurable.
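The measurability check above can be automated on a finite Ω: it suffices to test the threshold condition at the finitely many values the function takes. A sketch (outcomes ωᵢ are represented by the integers 1..6; helper names are illustrative):

```python
# Check the condition {omega : f(omega) <= y} in F for the die example
# (f = 1 on {1,2,3}, f = -1 on {4,5,6}).
omega = frozenset({1, 2, 3, 4, 5, 6})
odd, even = frozenset({1, 3, 5}), frozenset({2, 4, 6})
f_alg = {frozenset(), odd, even, omega}

def f(w):
    return 1 if w in {1, 2, 3} else -1

def is_measurable(func, fam, om):
    # enough to test thresholds y at the finitely many values of func
    return all(frozenset(w for w in om if func(w) <= y) in fam
               for y in {func(w) for w in om})

assert not is_measurable(f, f_alg, omega)   # {f <= -1} = {4,5,6} not in F
# an even/odd indicator, by contrast, is F-measurable
assert is_measurable(lambda w: w % 2, f_alg, omega)
```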
18 Lebesgue integral I

Definition 9. Lebesgue integral
Let (Ω, F) be a measurable space, µ : F → [0, ∞] a measure, and f : Ω → R an F-measurable function.

If f is a simple function, i.e., f(x) = cᵢ for all x ∈ Aᵢ with cᵢ ∈ R, then
∫_Ω f dµ = ∑_{i=1}^n cᵢ µ(Aᵢ).

If f is non-negative, we can always construct a sequence of simple functions f_n with f_n(x) ≤ f_{n+1}(x) which converges to f: lim_{n→∞} f_n(x) = f(x). With this sequence, the Lebesgue integral is defined by
∫_Ω f dµ = lim_{n→∞} ∫_Ω f_n dµ.
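The simple-function case is just a weighted sum. A minimal sketch, assuming µ is the Lebesgue measure restricted to intervals so that µ(Aᵢ) is the interval length (the representation of pieces is a hypothetical choice):

```python
# Integral of a simple function: sum of c_i * mu(A_i), with mu(A_i)
# taken to be interval length (Lebesgue measure of [a, b)).
def integral_simple(pieces):
    """pieces: list of (c_i, (a, b)) meaning f = c_i on the interval [a, b)."""
    return sum(c * (b - a) for c, (a, b) in pieces)

# f = 2 on [0, 1), f = 5 on [1, 3): integral = 2*1 + 5*2 = 12
assert integral_simple([(2, (0, 1)), (5, (1, 3))]) == 12
```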
19 Lebesgue integral II

Definition 10. Lebesgue integral (continued)
Let (Ω, F) be a measurable space, µ : F → [0, ∞] a measure, and f : Ω → R an F-measurable function. If f is an arbitrary measurable function, we write f = f⁺ − f⁻ with f⁺(x) = max(f(x), 0) and f⁻(x) = max(−f(x), 0), and then define
∫_Ω f dµ = ∫_Ω f⁺ dµ − ∫_Ω f⁻ dµ.

The integral above may be finite or infinite. It is not defined if ∫_Ω f⁺ dµ and ∫_Ω f⁻ dµ are both infinite.
20 Riemann vs. Lebesgue

The most important point is that the Lebesgue integral, like the Riemann integral, is a limit of approximating sums; the difference is that the Lebesgue sums partition the range of f rather than its domain. (Figure for Ω = R: the Riemann integral slices the x-axis into intervals of width Δx, the Lebesgue integral slices the y-axis into levels of height Δf.)
21 Riemann vs. Lebesgue integral

Theorem 1. Riemann-Lebesgue integral equivalence
Let f be a bounded function on [x₁, x₂] that is continuous except at countably many points of [x₁, x₂]. Then both the Riemann integral and the Lebesgue integral with respect to the Lebesgue measure λ exist and are the same:
∫_{x₁}^{x₂} f(x) dx = ∫_{[x₁, x₂]} f dλ.

There are more functions which are Lebesgue integrable than Riemann integrable.
22 Popular example for Riemann vs. Lebesgue

Consider the function
f(x) = 0, if x ∈ Q; 1, if x ∈ R\Q.

The Riemann integral ∫₀¹ f(x) dx does not exist, since the lower and upper sums do not converge to the same value. However, the Lebesgue integral
∫_{[0,1]} f dλ = 1
does exist, since f(x) is the indicator function of x ∈ R\Q and λ(Q ∩ [0, 1]) = 0.
23 Random Variable

Definition 11. Random variable
A real-valued random variable X is an F-measurable function defined on a probability space (Ω, F, P), mapping its sample space Ω into the real line R: X : Ω → R.

Since X is F-measurable, we have X⁻¹ : B(R) → F.
24 Distribution function

Definition 12. Distribution function
The distribution function of a random variable X, defined on a probability space (Ω, F, P), is defined by
F(x) = P(X ≤ x) = P({ω : X(ω) ≤ x}).

From this, the probability measure of the half-open sets in R is
P(a < X ≤ b) = P({ω : a < X(ω) ≤ b}) = F(b) − F(a).
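The identity P(a < X ≤ b) = F(b) − F(a) can be illustrated numerically for a standard normal X. A sketch using only the standard library (`math.erf` gives the normal distribution function; the helper name is an assumption):

```python
import math

# Distribution function of a normal random variable, F(x) = P(X <= x),
# expressed through the error function math.erf.
def norm_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# P(a < X <= b) = F(b) - F(a): e.g. the familiar ~68% within one sigma
p = norm_cdf(1.0) - norm_cdf(-1.0)
assert abs(p - 0.6827) < 1e-3
assert abs(norm_cdf(0.0) - 0.5) < 1e-12
```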
25 Density function

Closely related to the distribution function is the density function. Let f : R → R be a non-negative function satisfying ∫_R f dλ = 1. The function f is called a density function (with respect to the Lebesgue measure), and the associated probability measure for a random variable X, defined on (Ω, F, P), is
P({ω : X(ω) ∈ A}) = ∫_A f dλ for all A ∈ B(R).
26 Important Densities I

Poisson density or probability mass function (λ > 0):
f(x) = (λ^x / x!) e^{−λ}, x = 0, 1, 2, ....

Univariate normal density:
f(x) = (1 / (√(2π) σ)) e^{−(1/2)((x − µ)/σ)²}.
The normal variable is abbreviated as N(µ, σ).

Multivariate normal density (x, µ ∈ R^n; Σ ∈ R^{n×n}):
f(x) = (1 / √((2π)^n det(Σ))) e^{−(1/2)(x − µ)^T Σ^{−1}(x − µ)}.
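A quick numerical sanity check that these densities have total mass 1, using only the standard library (the truncation bounds and grid sizes are illustrative choices):

```python
import math

# Poisson pmf: sum over k of lambda^k / k! * exp(-lambda) should be 1.
lam = 3.0
poisson_mass = sum(lam**k / math.factorial(k) * math.exp(-lam)
                   for k in range(60))
assert abs(poisson_mass - 1.0) < 1e-12

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

# midpoint Riemann sum of the N(0,1) density over [-10, 10]
n, a, b = 200000, -10.0, 10.0
h = (b - a) / n
mass = sum(normal_pdf(a + (i + 0.5) * h) for i in range(n)) * h
assert abs(mass - 1.0) < 1e-6
```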
27 Important Densities II

Univariate Student t-density with ν degrees of freedom (x, µ ∈ R; σ ∈ R):
f(x) = (Γ((ν + 1)/2) / (Γ(ν/2) √(πν) σ)) (1 + (x − µ)² / (ν σ²))^{−(ν + 1)/2}.

Multivariate Student t-density with ν degrees of freedom (x, µ ∈ R^n; Σ ∈ R^{n×n}):
f(x) = (Γ((ν + n)/2) / (Γ(ν/2) √((πν)^n det(Σ)))) (1 + (1/ν)(x − µ)^T Σ^{−1}(x − µ))^{−(ν + n)/2}.
28 Important Densities II

The chi-square distribution with degree of freedom (dof) n has the density
f(x) = (x/2)^{n/2 − 1} e^{−x/2} / (2 Γ(n/2)),
which is abbreviated as Z ∼ χ²(n) and where Γ denotes the gamma function. A chi-square distributed random variable Y is created by
Y = ∑_{i=1}^n Xᵢ²,
where the Xᵢ are independent standard normally distributed random variables N(0, 1).
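The construction Y = ∑ Xᵢ² can be checked by simulation: a χ²(n) variable has mean n and variance 2n. A sketch (sample sizes, seed, and tolerances are arbitrary illustrative choices):

```python
import random

# Simulate chi-square(n) as a sum of n squared standard normals and
# compare the empirical mean and variance with n and 2n.
random.seed(0)
n, reps = 4, 50000
samples = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))
           for _ in range(reps)]
mean = sum(samples) / reps
var = sum((s - mean) ** 2 for s in samples) / reps
assert abs(mean - n) < 0.1       # E[chi2(n)] = n
assert abs(var - 2 * n) < 0.5    # var[chi2(n)] = 2n
```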
29 Important Densities III

A standard Student-t distributed random variable Y is generated by
Y = X / √(Z/ν),
where X ∼ N(0, 1) and Z ∼ χ²(ν).

Another important density is that of the Laplace distribution:
p(x) = (1/(2σ)) e^{−|x − µ|/σ},
with mean µ and diffusion σ. The variance of this distribution is given as 2σ².
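The claimed variance 2σ² of the Laplace density can be verified by numerical integration, again with only the standard library (truncation interval and grid size are illustrative choices):

```python
import math

# Laplace density p(x) = exp(-|x - mu| / sigma) / (2 sigma);
# check by a midpoint Riemann sum that the mass is 1 and the variance 2 sigma^2.
mu, sigma = 1.0, 0.5

def laplace_pdf(x):
    return math.exp(-abs(x - mu) / sigma) / (2.0 * sigma)

n, a, b = 200000, mu - 30.0, mu + 30.0
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]
mass = sum(laplace_pdf(x) for x in xs) * h
var = sum((x - mu) ** 2 * laplace_pdf(x) for x in xs) * h
assert abs(mass - 1.0) < 1e-6
assert abs(var - 2 * sigma ** 2) < 1e-4      # 2 sigma^2 = 0.5
```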
30 Expectation & Variance

Definition 13. Expectation of a random variable
The expectation of a random variable X, defined on a probability space (Ω, F, P), is defined by
E[X] = ∫_Ω X dP = ∫_R x f dλ,
where the second equality holds when X has density f. With this definition at hand, it does not matter what the sample space Ω is: the calculations for the two familiar cases of a finite Ω and Ω ⊆ R with continuous random variables remain the same.

Definition 14. Variance of a random variable
The variance of a random variable X, defined on a probability space (Ω, F, P), is defined by
var(X) = E[(X − E[X])²] = ∫_Ω (X − E[X])² dP = E[X²] − E[X]².
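On a finite Ω both formulas reduce to sums, and the identity var(X) = E[X²] − E[X]² can be checked exactly. A minimal sketch with the fair die and X(ω) = ω (an illustrative choice):

```python
from fractions import Fraction

# E[X] and var(X) on a finite probability space, in exact arithmetic.
p = {w: Fraction(1, 6) for w in range(1, 7)}   # fair die, X(omega) = omega

def expect(g):
    return sum(g(w) * pw for w, pw in p.items())

ex = expect(lambda w: w)
var = expect(lambda w: (w - ex) ** 2)
assert ex == Fraction(7, 2)                     # E[X] = 3.5
assert var == Fraction(35, 12)
assert var == expect(lambda w: w ** 2) - ex ** 2
```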
31 Normally distributed random variables

The shorthand notation X ∼ N(µ, σ²) for normally distributed random variables with parameters µ and σ is often found in the literature. The following properties are useful when dealing with normally distributed random variables:
- If X ∼ N(µ, σ²) and Y = aX + b, then Y ∼ N(aµ + b, a²σ²).
- If X₁ ∼ N(µ₁, σ₁²) and X₂ ∼ N(µ₂, σ₂²), then X₁ + X₂ ∼ N(µ₁ + µ₂, σ₁² + σ₂²) (if X₁ and X₂ are independent).
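The affine property can be illustrated by Monte Carlo, checking the first two moments of Y = aX + b (seed, sample size, and tolerances are arbitrary illustrative choices, and moments alone do not prove normality):

```python
import random

# If X ~ N(mu, sigma^2) and Y = a X + b, then Y ~ N(a mu + b, a^2 sigma^2);
# check the mean and variance of Y empirically.
random.seed(1)
mu, sigma, a, b = 3.0, 2.0, 2.0, 1.0
ys = [a * random.gauss(mu, sigma) + b for _ in range(100000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
assert abs(mean - (a * mu + b)) < 0.1         # expect a*mu + b = 7
assert abs(var - a ** 2 * sigma ** 2) < 0.5   # expect a^2 sigma^2 = 16
```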
32 Conditional Expectation I

From elementary probability theory (Bayes' rule):
P(A | B) = P(A ∩ B) / P(B) = P(B | A) P(A) / P(B), for P(B) > 0.

(Figure: overlapping events A and B inside Ω.)

The elementary conditional expectation is
E(X | B) = E(X · 1_B) / P(B), for P(B) > 0,
where 1_B denotes the indicator function of B.
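The elementary formula E(X | B) = E(X · 1_B) / P(B) in code, on the fair-die space with X(ω) = ω (helper names are illustrative):

```python
from fractions import Fraction

# Elementary conditional expectation E(X | B) = E(X * 1_B) / P(B).
p = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(event):
    return sum(p[w] for w in event)

def expect_indicator(x, event):
    """E(X * 1_B): sum x(w) p_w over w in B only."""
    return sum(x(w) * p[w] for w in event)

b = {1, 3, 5}                                  # condition on "odd"
cond = expect_indicator(lambda w: w, b) / prob(b)
assert cond == 3                               # (1 + 3 + 5) / 3
```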
33 Conditional Expectation II

General case: (Ω, F, P).

Definition 15. Conditional expectation
Let X be a random variable defined on the probability space (Ω, F, P) with E[|X|] < ∞. Furthermore, let G be a sub-σ-algebra of F (G ⊆ F). Then there exists a random variable Y with the following properties:
1. Y is G-measurable.
2. E[|Y|] < ∞.
3. For all sets G ∈ G we have ∫_G Y dP = ∫_G X dP.

The random variable Y = E[X | G] is called the conditional expectation.
34 Conditional Expectation: Example

(Figure: X(ω) plotted over Ω together with Y = E[X | G], where G = σ(G₁, ..., G₅); Y is constant on each set Gᵢ, i.e., a piecewise constant approximation of X.)

For the trivial σ-algebra {∅, Ω}:
Y = E[X | {∅, Ω}] = ∫_Ω X dP = E[X].
35 Conditional Expectation: Properties
- E(E(X | F)) = E(X).
- If X is F-measurable, then E(X | F) = X.
- Linearity: E(αX₁ + βX₂ | F) = αE(X₁ | F) + βE(X₂ | F).
- Positivity: If X ≥ 0 almost surely, then E(X | F) ≥ 0.
- Tower property: If G is a sub-σ-algebra of F, then E(E(X | F) | G) = E(X | G).
- Taking out what is known: If Z is G-measurable, then E(ZX | G) = Z E(X | G).
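The tower property can be verified exactly on a finite space where the σ-algebras are generated by partitions: conditioning on a partition averages X over each block. A sketch (the partitions and `cond_exp` helper are illustrative assumptions; the coarse partition must refine into the fine one so that G ⊆ F):

```python
from fractions import Fraction

# Tower property E(E(X | F) | G) = E(X | G), with F and G the
# sigma-algebras generated by a fine and a coarser partition of Omega.
p = {w: Fraction(1, 6) for w in range(1, 7)}

def cond_exp(x, partition):
    """E(X | sigma(partition)): on each block, the p-weighted average of x."""
    out = {}
    for block in partition:
        pb = sum(p[w] for w in block)
        avg = sum(x(w) * p[w] for w in block) / pb
        for w in block:
            out[w] = avg
    return lambda w: out[w]

fine = [{1, 2}, {3, 4}, {5, 6}]        # generates F
coarse = [{1, 2, 3, 4}, {5, 6}]        # generates G, a sub-sigma-algebra of F

x = lambda w: w * w
inner = cond_exp(x, fine)              # E(X | F)
lhs = cond_exp(inner, coarse)          # E(E(X | F) | G)
rhs = cond_exp(x, coarse)              # E(X | G)
assert all(lhs(w) == rhs(w) for w in p)
```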
36 Summary
- σ-algebra: collection of the events of interest, closed under elementary set operations.
- Borel σ-algebra: all the events of practical importance in R.
- Lebesgue measure: defined via the length of an interval.
- Density: transforms the Lebesgue measure into a probability measure.
- Measurable function: the σ-algebra of the probability space is rich enough.
- Random variable X: a measurable function X : Ω → R.
- Expectation, variance.
- Conditional expectation: a piecewise constant approximation of the underlying random variable.
More informationMath/Stats 425 Introduction to Probability. 1. Uncertainty and the axioms of probability
Math/Stats 425 Introduction to Probability 1. Uncertainty and the axioms of probability Processes in the real world are random if outcomes cannot be predicted with certainty. Example: coin tossing, stock
More informationMeasure Theory. 1 Measurable Spaces
1 Measurable Spaces Measure Theory A measurable space is a set S, together with a nonempty collection, S, of subsets of S, satisfying the following two conditions: 1. For any A, B in the collection S,
More informationLecture 3: Continuous distributions, expected value & mean, variance, the normal distribution
Lecture 3: Continuous distributions, expected value & mean, variance, the normal distribution 8 October 2007 In this lecture we ll learn the following: 1. how continuous probability distributions differ
More informationIntroduction to Hypothesis Testing. Point estimation and confidence intervals are useful statistical inference procedures.
Introduction to Hypothesis Testing Point estimation and confidence intervals are useful statistical inference procedures. Another type of inference is used frequently used concerns tests of hypotheses.
More informationMathematical Methods of Engineering Analysis
Mathematical Methods of Engineering Analysis Erhan Çinlar Robert J. Vanderbei February 2, 2000 Contents Sets and Functions 1 1 Sets................................... 1 Subsets.............................
More informationProbability and statistics; Rehearsal for pattern recognition
Probability and statistics; Rehearsal for pattern recognition Václav Hlaváč Czech Technical University in Prague Faculty of Electrical Engineering, Department of Cybernetics Center for Machine Perception
More informationEstimating the Average Value of a Function
Estimating the Average Value of a Function Problem: Determine the average value of the function f(x) over the interval [a, b]. Strategy: Choose sample points a = x 0 < x 1 < x 2 < < x n 1 < x n = b and
More informationP (x) 0. Discrete random variables Expected value. The expected value, mean or average of a random variable x is: xp (x) = v i P (v i )
Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =
More informationAn Introduction to Basic Statistics and Probability
An Introduction to Basic Statistics and Probability Shenek Heyward NCSU An Introduction to Basic Statistics and Probability p. 1/4 Outline Basic probability concepts Conditional probability Discrete Random
More informationProbability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom
Probability: Terminology and Examples Class 2, 18.05, Spring 2014 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of sample space, event and probability function. 2. Be able to
More informationExploratory Data Analysis
Exploratory Data Analysis Johannes Schauer johannes.schauer@tugraz.at Institute of Statistics Graz University of Technology Steyrergasse 17/IV, 8010 Graz www.statistics.tugraz.at February 12, 2008 Introduction
More informationBasics of Probability
Basics of Probability August 27 and September 1, 2009 1 Introduction A phenomena is called random if the exact outcome is uncertain. The mathematical study of randomness is called the theory of probability.
More information3 Multiple Discrete Random Variables
3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f
More informationThe Ergodic Theorem and randomness
The Ergodic Theorem and randomness Peter Gács Department of Computer Science Boston University March 19, 2008 Peter Gács (Boston University) Ergodic theorem March 19, 2008 1 / 27 Introduction Introduction
More informationMeasure Theory. Jesus FernandezVillaverde University of Pennsylvania
Measure Theory Jesus FernandezVillaverde University of Pennsylvania 1 Why Bother with Measure Theory? Kolmogorov (1933). Foundation of modern probability. Deals easily with: 1. Continuous versus discrete
More information4. Continuous Random Variables, the Pareto and Normal Distributions
4. Continuous Random Variables, the Pareto and Normal Distributions A continuous random variable X can take any value in a given range (e.g. height, weight, age). The distribution of a continuous random
More information3. The Multivariate Normal Distribution
3. The Multivariate Normal Distribution 3.1 Introduction A generalization of the familiar bell shaped normal density to several dimensions plays a fundamental role in multivariate analysis While real data
More informationProbability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur
Probability and Statistics Prof. Dr. Somesh Kumar Department of Mathematics Indian Institute of Technology, Kharagpur Module No. #01 Lecture No. #15 Special DistributionsVI Today, I am going to introduce
More informationRi and. i=1. S i N. and. R R i
The subset R of R n is a closed rectangle if there are n nonempty closed intervals {[a 1, b 1 ], [a 2, b 2 ],..., [a n, b n ]} so that R = [a 1, b 1 ] [a 2, b 2 ] [a n, b n ]. The subset R of R n is an
More informationALMOST COMMON PRIORS 1. INTRODUCTION
ALMOST COMMON PRIORS ZIV HELLMAN ABSTRACT. What happens when priors are not common? We introduce a measure for how far a type space is from having a common prior, which we term prior distance. If a type
More informationNPTEL STRUCTURAL RELIABILITY
NPTEL Course On STRUCTURAL RELIABILITY Module # 02 Lecture 6 Course Format: Web Instructor: Dr. Arunasis Chakraborty Department of Civil Engineering Indian Institute of Technology Guwahati 6. Lecture 06:
More informationIEOR 4106: Introduction to Operations Research: Stochastic Models. SOLUTIONS to Homework Assignment 1
IEOR 4106: Introduction to Operations Research: Stochastic Models SOLUTIONS to Homework Assignment 1 Probability Review: Read Chapters 1 and 2 in the textbook, Introduction to Probability Models, by Sheldon
More informationMATHEMATICAL METHODS OF STATISTICS
MATHEMATICAL METHODS OF STATISTICS By HARALD CRAMER TROFESSOK IN THE UNIVERSITY OF STOCKHOLM Princeton PRINCETON UNIVERSITY PRESS 1946 TABLE OF CONTENTS. First Part. MATHEMATICAL INTRODUCTION. CHAPTERS
More informationNOTES ON MEASURE THEORY. M. Papadimitrakis Department of Mathematics University of Crete. Autumn of 2004
NOTES ON MEASURE THEORY M. Papadimitrakis Department of Mathematics University of Crete Autumn of 2004 2 Contents 1 σalgebras 7 1.1 σalgebras............................... 7 1.2 Generated σalgebras.........................
More informationFinite and discrete probability distributions
8 Finite and discrete probability distributions To understand the algorithmic aspects of number theory and algebra, and applications such as cryptography, a firm grasp of the basics of probability theory
More informationA Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails
12th International Congress on Insurance: Mathematics and Economics July 1618, 2008 A Uniform Asymptotic Estimate for Discounted Aggregate Claims with Subexponential Tails XUEMIAO HAO (Based on a joint
More informationMath 431 An Introduction to Probability. Final Exam Solutions
Math 43 An Introduction to Probability Final Eam Solutions. A continuous random variable X has cdf a for 0, F () = for 0 <
More informationChapter ML:IV. IV. Statistical Learning. Probability Basics Bayes Classification Maximum aposteriori Hypotheses
Chapter ML:IV IV. Statistical Learning Probability Basics Bayes Classification Maximum aposteriori Hypotheses ML:IV1 Statistical Learning STEIN 20052015 Area Overview Mathematics Statistics...... Stochastics
More informationSequences and Convergence in Metric Spaces
Sequences and Convergence in Metric Spaces Definition: A sequence in a set X (a sequence of elements of X) is a function s : N X. We usually denote s(n) by s n, called the nth term of s, and write {s
More informationCHAPTER 6: Continuous Uniform Distribution: 6.1. Definition: The density function of the continuous random variable X on the interval [A, B] is.
Some Continuous Probability Distributions CHAPTER 6: Continuous Uniform Distribution: 6. Definition: The density function of the continuous random variable X on the interval [A, B] is B A A x B f(x; A,
More information