Probability for Estimation (review)




Probability for Estimation (review)

In general, we want to develop an estimator for systems of the form:

ẋ = f(x, u) + η(t);  y = h(x) + ω(t)  (given y, find x)

We will primarily focus on discrete-time linear systems:

x_{k+1} = A_k x_k + B_k u_k + η_k;  y_k = H_k x_k + ω_k

where A_k, B_k, H_k are constant matrices; x_k is the state at time t_k; u_k is the control at time t_k; and η_k, ω_k are disturbances at time t_k.

Goal: develop a procedure to model disturbances for estimation, using Kolmogorov probability, based on axiomatic set theory (1930s onward).
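To make the discrete-time model concrete, here is a minimal Python sketch of simulating x_{k+1} = A_k x_k + B_k u_k + η_k with measurements y_k = H_k x_k + ω_k. The particular matrices, the Gaussian disturbances, and all function names are illustrative assumptions, not part of the notes.

```python
import numpy as np

def simulate_linear_system(A, B, H, x0, controls, noise_std, rng):
    """Simulate x_{k+1} = A x_k + B u_k + eta_k, y_k = H x_k + omega_k.

    eta_k and omega_k are drawn as zero-mean Gaussian disturbances
    (an assumption for illustration; the notes only call them disturbances).
    """
    x = x0
    states, measurements = [x0], []
    for u in controls:
        eta = rng.normal(0.0, noise_std, size=x.shape)          # process disturbance
        omega = rng.normal(0.0, noise_std, size=(H.shape[0],))  # measurement disturbance
        measurements.append(H @ x + omega)                      # y_k = H x_k + omega_k
        x = A @ x + B @ u + eta                                 # x_{k+1} = A x_k + B u_k + eta_k
        states.append(x)
    return states, measurements

# Hypothetical 2-state example (position/velocity) with scalar control and measurement
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
H = np.array([[1.0, 0.0]])
rng = np.random.default_rng(0)
states, ys = simulate_linear_system(A, B, H, np.zeros(2), [np.array([1.0])] * 10, 0.01, rng)
```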

Axioms of Set-Based Probability

Probability Space: Let Ω be a set of experimental outcomes (e.g., the roll of a die), Ω = {A_1, A_2, …, A_N}. The A_i are elementary events, and subsets of Ω are termed events. The empty set { } is the impossible event; S = Ω is the certain event. A probability space is a triple (Ω, F, P), where F is a set of subsets of Ω (the events) and P assigns probabilities to events.

Probability of an Event, the Key Axioms: Assign to each event A_i a number P(A_i), termed the probability of event A_i. P must satisfy these axioms:
1. P(A_i) ≥ 0
2. P(S) = 1
3. If events A, B ⊆ Ω are mutually exclusive, or disjoint (A ∩ B = { }), then P(A ∪ B) = P(A) + P(B)
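The axioms can be checked directly on a finite outcome space. The sketch below (illustrative, not from the notes) assigns the uniform probability to subsets of a fair die's outcome space and verifies the three axioms hold.

```python
from fractions import Fraction

# Outcome space for one fair die; P assigns equal mass to each elementary event.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event (a subset of omega) under the uniform assignment."""
    assert event <= omega, "events must be subsets of the outcome space"
    return Fraction(len(event), len(omega))

A, B = {1, 2}, {5}                   # disjoint events
assert P(A) >= 0                     # Axiom 1: nonnegativity
assert P(omega) == 1                 # Axiom 2: the certain event has probability 1
assert P(A | B) == P(A) + P(B)       # Axiom 3: additivity for disjoint events
assert P(set()) == 0                 # consequence: the impossible event
assert P(omega - A) == 1 - P(A)      # consequence: complement rule
```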

Axioms of Set-Based Probability (continued)

As a result of these three axioms and basic set operations (e.g., De Morgan's laws, such as (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ):
- P({ }) = 0
- P(Aᶜ) = 1 − P(A), i.e. P(A) + P(Aᶜ) = 1, where Aᶜ is the complement of A
- If A_1, A_2, …, A_N are mutually disjoint, then P(A_1 ∪ A_2 ∪ … ∪ A_N) = P(A_1) + P(A_2) + … + P(A_N)

For Ω an infinite but countable set, we add the axiom of infinite additivity:
3(b). If A_1, A_2, … are mutually exclusive, then P(A_1 ∪ A_2 ∪ …) = P(A_1) + P(A_2) + …

We assume that all countable sets of events satisfy Axioms 1, 2, 3, and 3(b). But we also need to model uncountable sets.

Continuous Random Variables (CRVs)

Let Ω = ℝ (an uncountable set of outcomes). Problem: it is not possible to assign probabilities to all subsets of ℝ in a way that satisfies the above axioms. Solution: let events be intervals of ℝ, A = {x : x_l ≤ x ≤ x_u}, together with their countable unions and intersections, and assign probabilities to these events:

P(x_l ≤ x ≤ x_u) = probability that x takes a value in [x_l, x_u]

x is then a continuous random variable (CRV). Some basic properties of CRVs:
- If x is a CRV on [L, U], then P(L ≤ x ≤ U) = 1
- If y ∈ [L, U], then P(L ≤ x ≤ y) = 1 − P(y ≤ x ≤ U)
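The complement property above can be checked numerically. This small Monte Carlo sketch (illustrative; the interval and sample count are arbitrary choices) draws samples of a uniform CRV on [L, U] and confirms P(L ≤ x ≤ y) = 1 − P(y ≤ x ≤ U).

```python
import random

# Monte Carlo check of P(L <= x <= y) = 1 - P(y <= x <= U) for x uniform on [L, U]
random.seed(1)
L, U, y = 0.0, 2.0, 0.5
samples = [random.uniform(L, U) for _ in range(100_000)]
p_left = sum(s <= y for s in samples) / len(samples)
p_right = sum(s >= y for s in samples) / len(samples)

assert abs(p_left - (1 - p_right)) < 1e-6        # complements (ties have measure zero)
assert abs(p_left - (y - L) / (U - L)) < 0.01    # matches the exact uniform probability
```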

Probability Density Function (pdf)

The pdf p(x) of a CRV x is defined by P(x_l ≤ x ≤ x_u) = ∫_{x_l}^{x_u} p(x) dx.

E.g., uniform pdf on [a, b]: p(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

Gaussian (normal) pdf: p(x) = (1/(σ√(2π))) exp(−(1/2)((x − μ)/σ)²), where μ is the mean of the pdf and σ is the standard deviation.

Most of our estimation theory will be built on the Gaussian distribution.
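As a concrete sketch (not from the notes), the Gaussian pdf and the interval probability P(x_l ≤ x ≤ x_u) can be evaluated with the standard error function; the ±1σ interval should carry about 68% of the probability mass.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Gaussian density p(x) = 1/(sigma*sqrt(2*pi)) * exp(-0.5*((x - mu)/sigma)**2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gaussian_prob(x_l, x_u, mu, sigma):
    """P(x_l <= x <= x_u) = integral of the pdf, computed via the Gaussian CDF."""
    cdf = lambda x: 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))
    return cdf(x_u) - cdf(x_l)

# Roughly 68% of the mass lies within one standard deviation of the mean
p = gaussian_prob(-1.0, 1.0, mu=0.0, sigma=1.0)
assert abs(p - 0.6827) < 0.001
```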

Joint & Conditional Probability

Joint probability, countable set of events: P(A ∩ B) = P(A, B), the probability that A and B both occur.

CRVs: let x, y be two CRVs defined on the same probability space. Their joint probability density function p(x, y) is defined by:

P(x_l ≤ x ≤ x_u; y_l ≤ y ≤ y_u) = ∫_{y_l}^{y_u} ∫_{x_l}^{x_u} p(x, y) dx dy

Independence: A, B are independent if P(A, B) = P(A)P(B); x, y are independent if p(x, y) = p(x)p(y).

Conditional probability, countable events: P(A|B) = P(A ∩ B)/P(B), the probability of A given that B occurred. E.g., the probability that a 2 is rolled on a fair die, given that we know the roll is even: with A = {2} and B = {even roll}, P(B) = 3/6 = 1/2 and P(A ∩ B) = 1/6 (since A ∩ B = A), so P(2 | even roll) = P(A ∩ B)/P(B) = (1/6)/(1/2) = 1/3.
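The die example can be verified by direct enumeration. This sketch (Python, purely illustrative) computes P(A|B) = P(A ∩ B)/P(B) with exact fractions.

```python
from fractions import Fraction

outcomes = {1, 2, 3, 4, 5, 6}               # fair die: each outcome has probability 1/6
A = {2}                                     # event: a 2 is rolled
B = {2, 4, 6}                               # event: the roll is even
P = lambda E: Fraction(len(E & outcomes), len(outcomes))

p_A_given_B = P(A & B) / P(B)               # P(A|B) = P(A ∩ B) / P(B)
assert P(B) == Fraction(1, 2)
assert P(A & B) == Fraction(1, 6)           # here A ∩ B = A
assert p_A_given_B == Fraction(1, 3)
```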

Conditional Probability & Expectation

Conditional probability (continued). For CRVs, the conditional density is:

p(x|y) = p(x, y)/p(y),  if 0 < p(y)

This follows from:

P(x_l ≤ x ≤ x_u | y) = ∫_{x_l}^{x_u} p(x|y) dx

and the marginal densities:

p(x) = ∫ p(x, y) dy = ∫ p(x|y) p(y) dy;  p(y) = ∫ p(x, y) dx

Expectation (key for estimation): let x be a CRV with density p(x). The expected value (or mean) of x is:

E[x] = ∫ x p(x) dx,  and for a function g,  E[g(x)] = ∫ g(x) p(x) dx

The conditional mean (conditional expected value) of x given event M is:

E[x|M] = ∫ x p(x|M) dx
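The expectation integrals can be approximated by sample averages. As an illustrative check (the uniform distribution and sample size are assumptions), for x uniform on [0, 1] we have E[x] = 1/2 and E[x²] = 1/3.

```python
import random

# Monte Carlo approximation of E[x] and E[g(x)] for x ~ Uniform[0, 1]:
# E[x] = ∫ x p(x) dx = 1/2, and with g(x) = x^2, E[g(x)] = ∫ x^2 p(x) dx = 1/3.
random.seed(0)
samples = [random.random() for _ in range(200_000)]
mean = sum(samples) / len(samples)               # sample estimate of E[x]
mean_sq = sum(s * s for s in samples) / len(samples)  # sample estimate of E[x^2]

assert abs(mean - 0.5) < 0.01
assert abs(mean_sq - 1 / 3) < 0.01
```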

Expectation (continued)

Mean square: E[x²] = ∫ x² p(x) dx

Variance: σ² = E[(x − μ_x)²] = ∫ (x − μ_x)² p(x) dx,  where μ_x = E[x]
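The variance definition can be cross-checked against the equivalent form E[x²] − μ². This sketch (uniform distribution chosen for illustration; its exact variance is 1/12) compares the two numerically.

```python
import random

# Check sigma^2 = E[(x - mu)^2] against E[x^2] - mu^2 by Monte Carlo,
# for x ~ Uniform[0, 1] (exact variance 1/12 ≈ 0.0833).
random.seed(2)
xs = [random.random() for _ in range(200_000)]
mu = sum(xs) / len(xs)
var_def = sum((x - mu) ** 2 for x in xs) / len(xs)        # definition
var_alt = sum(x * x for x in xs) / len(xs) - mu ** 2      # E[x^2] - mu^2

assert abs(var_def - var_alt) < 1e-9     # algebraically identical
assert abs(var_def - 1 / 12) < 0.005     # close to the exact value
```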

Random Processes

A random process is a stochastic system whose state is characterized by a time-evolving CRV x(t), t ∈ [0, T]. At each t, x(t) is a CRV. x(t) is the state of the random process, which can be characterized by:

P[x_l ≤ x(t) ≤ x_u] = ∫_{x_l}^{x_u} p(x, t) dx

Random processes can also be characterized by the joint probability density function:

P[x_{1l} ≤ x(t_1) ≤ x_{1u}; x_{2l} ≤ x(t_2) ≤ x_{2u}] = ∫_{x_{2l}}^{x_{2u}} ∫_{x_{1l}}^{x_{1u}} p(x_1, x_2, t_1, t_2) dx_1 dx_2

and by the correlation function:

ρ(t_1, t_2) = E[x(t_1) x(t_2)] = ∫∫ x_1 x_2 p(x_1, x_2, t_1, t_2) dx_1 dx_2

A random process x(t) is stationary if p(x, t + τ) = p(x, t) for all τ.
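The correlation function can be estimated by averaging over sample paths. The toy process below, x(t) = a·t with a drawn once per realization, is my own illustrative choice (not from the notes); analytically ρ(t_1, t_2) = E[a²]·t_1·t_2 = (1/3)·t_1·t_2 for a uniform on [−1, 1].

```python
import random

# Empirical correlation function rho(t1, t2) = E[x(t1) x(t2)] for the toy
# random process x(t) = a * t, a ~ Uniform[-1, 1] drawn once per realization.
random.seed(3)

def correlation(t1, t2, n_paths=200_000):
    total = 0.0
    for _ in range(n_paths):
        a = random.uniform(-1.0, 1.0)       # one draw defines the whole path
        total += (a * t1) * (a * t2)        # x(t1) * x(t2) on this realization
    return total / n_paths

rho = correlation(1.0, 2.0)                 # exact value: (1/3) * 1 * 2 = 2/3
assert abs(rho - 2 / 3) < 0.02
```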

Vector-Valued Random Processes

X(t) = [X_1(t), …, X_n(t)]ᵀ, where each X_i(t) is a random process.

Correlation matrix: R(t_1, t_2) = E[X(t_1) Xᵀ(t_2)], the n×n matrix whose (i, j) entry is E[X_i(t_1) X_j(t_2)].

Covariance matrix: Σ(t) = E[(X(t) − μ(t))(X(t) − μ(t))ᵀ],  where μ(t) = E[X(t)]
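As a final sketch (the 2-dimensional Gaussian vector and its true covariance are illustrative assumptions), the covariance matrix Σ = E[(X − μ)(X − μ)ᵀ] can be estimated from samples and checked for symmetry.

```python
import numpy as np

# Sample covariance matrix Sigma = E[(X - mu)(X - mu)^T] for a 2-d random
# vector, estimated from draws of a Gaussian with a known true covariance.
rng = np.random.default_rng(4)
true_cov = np.array([[2.0, 0.5], [0.5, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_cov, size=100_000)

mu = X.mean(axis=0)                          # sample estimate of mu = E[X]
centered = X - mu
sigma = centered.T @ centered / len(X)       # sample E[(X - mu)(X - mu)^T]

assert sigma.shape == (2, 2)
assert np.allclose(sigma, sigma.T)           # covariance matrices are symmetric
assert np.allclose(sigma, true_cov, atol=0.05)
```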