Introduction to Kalman Filtering




Introduction to Kalman Filtering. A set of two lectures. Maria Isabel Ribeiro, Associate Professor, Instituto Superior Técnico / Instituto de Sistemas e Robótica. June. All rights reserved.

INTRODUCTION TO KALMAN FILTERING
What is a Kalman Filter? Introduction to the Concept. Which is the best estimate? Basic Assumptions.
Discrete Kalman Filter: Problem Formulation. From the Assumptions to the Problem Solution. Towards the Solution. Filter dynamics: Prediction cycle, Filtering cycle. Summary. Properties of the Discrete KF. A simple example.
The meaning of the error covariance matrix.
The Extended Kalman Filter.
M.Isabel Ribeiro - June

WHAT IS A KALMAN FILTER?
Optimal Recursive Data Processing Algorithm.
Typical Kalman filter application: controls drive a system subject to system error sources; the system state (desired but not known) is observed through measuring devices subject to measurement error sources; the Kalman filter combines the observed measurements into an optimal estimate of the system state.

WHAT IS A KALMAN FILTER? Introduction to the Concept
Optimal Recursive Data Processing Algorithm: "optimal" is dependent upon the criteria chosen to evaluate performance. Under certain assumptions, the KF is optimal with respect to virtually any criterion that makes sense.
The KF incorporates all available information:
knowledge of the system and measurement device dynamics;
statistical description of the system noises, measurement errors, and uncertainty in the dynamics models;
any available information about the initial conditions of the variables of interest.

WHAT IS A KALMAN FILTER? Introduction to the concept
Optimal Recursive Data Processing Algorithm
x(k+1) = f(x(k), u(k), w(k))
z(k+1) = h(x(k+1), v(k+1))
x - state; f - system dynamics; h - measurement function; u - controls; w - system error sources; v - measurement error sources; z - observed measurements.
Given f, h, the noise characterization, the initial conditions, and the measurements z(1), z(2), ..., z(k), obtain the best estimate of x(k).

WHAT IS A KALMAN FILTER? Introduction to the concept
Optimal Recursive Data Processing Algorithm: the KF does not require all previous data to be kept in storage and reprocessed every time a new measurement is taken.
z(1), z(2), ..., z(k) → KF → x̂(k)
z(1), z(2), ..., z(k), z(k+1) → KF → x̂(k+1)
To evaluate x̂(k+1) the KF only requires x̂(k) and z(k+1).

WHAT IS A KALMAN FILTER? Introduction to the concept
Optimal Recursive Data Processing Algorithm: the KF is a data processing algorithm, i.e., a computer program running in a central processor.

WHAT IS THE KALMAN FILTER? Which is the best estimate?
Any type of filter tries to obtain an optimal estimate of desired quantities from data provided by a noisy environment. Best = minimizing errors in some respect.
Bayesian viewpoint: the filter propagates the conditional probability density of the desired quantities, conditioned on the knowledge of the actual data coming from the measuring devices.
Why base the state estimation on the conditional probability density function?

WHAT IS A KALMAN FILTER? Which is the best estimate? Example
x(i): one-dimensional position of a vehicle at time instant i.
z(j): two-dimensional vector describing the measurements of position at time j by two separate radars.
If z(1) = z_1, z(2) = z_2, ..., z(j) = z_j, then
p_{x(i)|z(1),...,z(i)}(x | z_1, z_2, ..., z_i)
represents all the information we have on x(i) based (conditioned) on the measurements acquired up to time i: given the values of all measurements taken up to time i, this conditional pdf indicates what the probability would be of x(i) assuming any particular value or range of values.

WHAT IS A KALMAN FILTER? Which is the best estimate?
The shape of p_{x(i)|z(1),...,z(i)}(x | z_1, z_2, ..., z_i) conveys the amount of certainty we have in the knowledge of the value of x.
Based on this conditional pdf, the estimate can be:
the mean, the center of the probability mass (MMSE);
the mode, the value of x that has the highest probability (MAP);
the median, the value of x such that half the probability weight lies to the left and half to the right of it.

WHAT IS THE KALMAN FILTER? Basic Assumptions
The Kalman Filter performs the conditional probability density propagation for systems that can be described through a LINEAR model in which the system and measurement noises are WHITE and GAUSSIAN.
Under these assumptions the conditional pdf is Gaussian:
mean = mode = median;
there is a unique best estimate of the state;
the KF is the best filter among all the possible filter types.
What happens if these assumptions are relaxed? Is the KF still an optimal filter? In which class of filters?

DISCRETE KALMAN FILTER - Problem Formulation
MOTIVATION
Given a discrete-time, linear, time-varying plant with random initial state, driven by white plant noise, and given noisy measurements of linear combinations of the plant state variables, determine the best estimate of the system state variables.
STATE DYNAMICS AND MEASUREMENT EQUATION
x(k+1) = A(k) x(k) + B(k) u(k) + G(k) w(k)
z(k) = C(k) x(k) + v(k)
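The state and measurement equations above can be simulated directly; the following minimal sketch does so in numpy for an illustrative constant-velocity plant (the matrices A, B, C, G, Q, R and noise levels below are assumptions for illustration, not values from the lecture):

```python
import numpy as np

# Sketch of the plant x(k+1) = A x(k) + B u(k) + G w(k), z(k) = C x(k) + v(k).
rng = np.random.default_rng(0)
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # illustrative constant-velocity dynamics
B = np.array([[0.5], [1.0]])
G = np.eye(2)
C = np.array([[1.0, 0.0]])               # only position is measured
Q = 0.01 * np.eye(2)                     # system noise covariance
R = np.array([[0.25]])                   # measurement noise covariance

def step(x, u):
    """One step of the stochastic plant: returns next state and measurement."""
    w = rng.multivariate_normal(np.zeros(2), Q)      # white Gaussian system noise
    x_next = A @ x + B @ u + G @ w
    v = rng.multivariate_normal(np.zeros(1), R)      # white Gaussian measurement noise
    z = C @ x_next + v
    return x_next, z

x = np.array([0.0, 1.0])
for k in range(5):
    x, z = step(x, np.array([0.0]))
print(x.shape, z.shape)  # (2,) (1,)
```

The filter described in the following slides never sees x directly, only the sequence of measurements z produced by this kind of model.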

DISCRETE KALMAN FILTER - Problem Formulation
VARIABLE DEFINITIONS
x ∈ R^n: state vector (stochastic non-white process)
u ∈ R^m: deterministic input sequence
w ∈ R^n: white Gaussian system noise (assumed with zero mean)
v ∈ R^r: white Gaussian measurement noise (assumed with zero mean)
z ∈ R^r: measurement vector (stochastic non-white sequence)

DISCRETE KALMAN FILTER - Problem Formulation
INITIAL CONDITIONS
x(0) is a Gaussian random vector with mean E[x(0)] = x̄_0 and covariance matrix E[(x(0) - x̄_0)(x(0) - x̄_0)^T] = P_0.
STATE AND MEASUREMENT NOISE
zero mean: E[w(k)] = E[v(k)] = 0;
{w(k)}, {v(k)} are white Gaussian sequences, with covariances E[w(k) w^T(j)] = Q(k) δ(k-j) and E[v(k) v^T(j)] = R(k) δ(k-j);
x(0), w(k) and v(j) are independent for all k and j.

DISCRETE KALMAN FILTER - Problem Formulation
DEFINITION OF THE FILTERING PROBLEM
Let k denote the present value of time.
Given the sequence of past inputs U(k-1) = {u(0), u(1), ..., u(k-1)} and the sequence of past measurements Z(k) = {z(1), z(2), ..., z(k)}, evaluate the best estimate of the state x(k).

DISCRETE KALMAN FILTER - Problem Formulation
x(k+1) = A(k) x(k) + B(k) u(k) + G(k) w(k), z(k) = C(k) x(k) + v(k)
Given x(0): nature applies w(0), we apply u(0); the system moves to state x(1); we make a measurement z(1).
Question: which is the best estimate of x(1)? Answer: obtained from p(x(1) | Z(1)).
Nature applies w(1), we apply u(1); the system moves to state x(2); we make a measurement z(2).
Question: which is the best estimate of x(2)? Answer: obtained from p(x(2) | Z(2)). ...

DISCRETE KALMAN FILTER - Problem Formulation
... Question: which is the best estimate of x(k-1)? Answer: obtained from p(x(k-1) | Z(k-1)).
Nature applies w(k-1), we apply u(k-1); the system moves to state x(k); we make a measurement z(k).
Question: which is the best estimate of x(k)? Answer: obtained from p(x(k) | Z(k)). ...

DISCRETE KALMAN FILTER - Towards the Solution
The filter has to propagate the conditional probability density functions:
p(x(0)) → p(x(1) | Z(1)) → x̂(1|1) → p(x(2) | Z(2)) → x̂(2|2) → ... → p(x(k) | Z(k)) → x̂(k|k) → ...

DISCRETE KALMAN FILTER - From the Assumptions to the Problem Solution
The LINEARITY of the system state equation and of the system observation equation, together with the GAUSSIAN nature of the initial state x(0), the system white noise w, and the measurement white noise v, imply that p(x(k) | Z(k)) is Gaussian, and therefore uniquely characterized by
the conditional mean x̂(k|k) = E[x(k) | Z(k)]
and the conditional covariance P(k|k) = cov[x(k); x(k) | Z(k)]:
p(x(k) | Z(k)) ~ N(x̂(k|k), P(k|k)).

DISCRETE KALMAN FILTER - Towards the Solution
As the conditional probability density functions are Gaussian, the Kalman filter only propagates the first two moments:
p(x(k) | Z(k)) → E[x(k) | Z(k)] = x̂(k|k) and P(k|k),
at every step of the sequence p(x(0)), p(x(1) | Z(1)), p(x(2) | Z(2)), ..., p(x(k) | Z(k)), ...

DISCRETE KALMAN FILTER - Towards the Solution
We stated that the state estimate equals the conditional mean, x̂(k|k) = E[x(k) | Z(k)]. Why? Why not the mode of p(x(k) | Z(k))? Why not the median of p(x(k) | Z(k))?
As p(x(k) | Z(k)) is Gaussian, mean = mode = median.

DISCRETE KALMAN FILTER - Filter dynamics
The KF dynamics is recursive. With Z(k) = {z(1), z(2), ..., z(k)} and U(k) = {u(0), u(1), ..., u(k)}, the new data sets are Z(k+1) = {Z(k), z(k+1)} and U(k+1) = {U(k), u(k+1)}, and the filter goes from p(x(k) | Z(k)) to p(x(k+1) | Z(k+1)) in two steps:
Prediction cycle: p(x(k) | Z(k)) → p(x(k+1) | Z(k)). What can we say about x(k+1) before we make the measurement z(k+1)?
Filtering cycle: p(x(k+1) | Z(k)) → p(x(k+1) | Z(k+1)). How can we improve our information on x(k+1) after we make the measurement z(k+1)?

DISCRETE KALMAN FILTER - Filter dynamics
p(x(0)) → prediction → p(x(1) | Z(0)) → filtering → p(x(1) | Z(1)) → prediction → p(x(2) | Z(1)) → filtering → p(x(2) | Z(2)) → ... → prediction → p(x(k+1) | Z(k)) → filtering → p(x(k+1) | Z(k+1)) → ...

DISCRETE KALMAN FILTER - Filter dynamics - Prediction cycle
p(x(k) | Z(k)) ~ N(x̂(k|k), P(k|k)) is assumed known. Is p(x(k+1) | Z(k)) Gaussian? What are x̂(k+1|k) = E[x(k+1) | Z(k)] and P(k+1|k) = cov[x(k+1); x(k+1) | Z(k)]?
x̂(k+1|k) = E[A(k) x(k) + B(k) u(k) + G(k) w(k) | Z(k)] = A(k) E[x(k) | Z(k)] + B(k) E[u(k) | Z(k)] + G(k) E[w(k) | Z(k)]
x̂(k+1|k) = A(k) x̂(k|k) + B(k) u(k)

DISCRETE KALMAN FILTER - Filter dynamics - Prediction cycle
P(k+1|k) = cov[x(k+1); x(k+1) | Z(k)]
Prediction error:
x̃(k+1|k) = x(k+1) - x̂(k+1|k) = A(k) x(k) + B(k) u(k) + G(k) w(k) - (A(k) x̂(k|k) + B(k) u(k)) = A(k) x̃(k|k) + G(k) w(k)
Using cov[y; y] = E[(y - ȳ)(y - ȳ)^T],
P(k+1|k) = E[x̃(k+1|k) x̃^T(k+1|k) | Z(k)] = A(k) P(k|k) A^T(k) + G(k) Q(k) G^T(k)

DISCRETE KALMAN FILTER - Filter dynamics - Filtering cycle
Given p(x(k+1) | Z(k)) ~ N(x̂(k+1|k), P(k+1|k)) and the new measurement z(k+1), what is p(x(k+1) | Z(k+1))?
1st step - Measurement prediction: what can we say about z(k+1) before we make the measurement?
p(z(k+1) | Z(k)) = p(C(k+1) x(k+1) + v(k+1) | Z(k))
E[z(k+1) | Z(k)] = ẑ(k+1|k) = C(k+1) x̂(k+1|k)
cov[z(k+1); z(k+1) | Z(k)] = P_z(k+1|k) = C(k+1) P(k+1|k) C^T(k+1) + R(k+1)

DISCRETE KALMAN FILTER - Filter dynamics - Filtering cycle
2nd step:
E[x(k+1) | Z(k+1)] = E[x(k+1) | Z(k), z(k+1)]
Z(k+1) and {Z(k), z̃(k+1|k)} are equivalent from the point of view of the information they contain, so
E[x(k+1) | Z(k+1)] = E[x(k+1) | Z(k), z̃(k+1|k)].
Required result: if x, y and z are jointly Gaussian and y and z are statistically independent, then
E[x | y, z] = E[x | y] + E[x | z] - m_x.

DISCRETE KALMAN FILTER - Filter dynamics - Filtering cycle
x̂(k+1|k+1) = x̂(k+1|k) + P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1) (z(k+1) - C(k+1) x̂(k+1|k))
where K(k+1) = P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1) is the Kalman gain, ẑ(k+1|k) = C(k+1) x̂(k+1|k) is the measurement prediction, and z̃(k+1|k) = z(k+1) - C(k+1) x̂(k+1|k), so that
x̂(k+1|k+1) = x̂(k+1|k) + K(k+1) (z(k+1) - C(k+1) x̂(k+1|k))
P(k+1|k+1) = P(k+1|k) - P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1) C(k+1) P(k+1|k)

DISCRETE KALMAN FILTER - Dynamics
Linear system:
x(k+1) = A(k) x(k) + B(k) u(k) + G(k) w(k), z(k) = C(k) x(k) + v(k)
Discrete Kalman filter:
Prediction:
x̂(k+1|k) = A(k) x̂(k|k) + B(k) u(k)
P(k+1|k) = A(k) P(k|k) A^T(k) + G(k) Q(k) G^T(k)
Filtering:
x̂(k+1|k+1) = x̂(k+1|k) + K(k+1) (z(k+1) - C(k+1) x̂(k+1|k))
P(k+1|k+1) = P(k+1|k) - P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1) C(k+1) P(k+1|k)
K(k+1) = P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1)
Initial conditions: x̂(0|0) = x̄_0, P(0|0) = P_0.
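The prediction and filtering cycles summarized on this slide translate directly into a few lines of numpy; the sketch below implements one full KF recursion and runs it on an illustrative 1-D random walk (all numerical values are assumptions for the example, not from the lecture):

```python
import numpy as np

def kf_step(x_hat, P, u, z, A, B, C, G, Q, R):
    """One recursion of the discrete Kalman filter (prediction + filtering)."""
    # Prediction cycle
    x_pred = A @ x_hat + B @ u                 # x^(k+1|k) = A x^(k|k) + B u(k)
    P_pred = A @ P @ A.T + G @ Q @ G.T         # P(k+1|k)
    # Filtering cycle
    S = C @ P_pred @ C.T + R                   # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)        # Kalman gain K(k+1)
    x_new = x_pred + K @ (z - C @ x_pred)      # x^(k+1|k+1)
    P_new = P_pred - K @ C @ P_pred            # P(k+1|k+1)
    return x_new, P_new

# Illustrative 1-D random walk: x(k+1) = x(k) + w(k), z(k) = x(k) + v(k).
rng = np.random.default_rng(1)
A = B = C = G = np.eye(1)
Q, R = 0.01 * np.eye(1), 1.0 * np.eye(1)
x_true = np.zeros(1)
x_hat, P = np.zeros(1), 10.0 * np.eye(1)       # initial conditions x0, P0
for k in range(50):
    x_true = x_true + rng.normal(0.0, np.sqrt(Q[0, 0]), 1)
    z = x_true + rng.normal(0.0, np.sqrt(R[0, 0]), 1)
    x_hat, P = kf_step(x_hat, P, np.zeros(1), z, A, B, C, G, Q, R)
print(float(P[0, 0]))   # error variance settles well below the initial P0
```

Note that P(k|k) is computed from A, C, G, Q, R alone, so the error covariance sequence can be evaluated offline, before any measurement is made.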

DISCRETE KALMAN FILTER - Properties
The Discrete KF is a time-varying linear system:
x̂(k+1|k+1) = (I - K(k+1) C(k+1)) (A(k) x̂(k|k) + B(k) u(k)) + K(k+1) z(k+1)
Even when the system is time-invariant and has stationary noise,
x̂(k+1|k+1) = (I - K(k+1) C) (A x̂(k|k) + B u(k)) + K(k+1) z(k+1),
the Kalman gain is not constant. Does the Kalman gain matrix converge to a constant matrix? Under which conditions?

DISCRETE KALMAN FILTER - Properties
The state estimate is a linear function of the measurements. KF dynamics in terms of the filtering estimate:
x̂(k+1|k+1) = Φ(k+1) x̂(k|k) + K(k+1) z(k+1), with Φ(k+1) = (I - K(k+1) C(k+1)) A(k) and x̂(0|0) = x̄_0.
Assuming null inputs for the sake of simplicity:
x̂(1|1) = Φ(1) x̂(0|0) + K(1) z(1)
x̂(2|2) = Φ(2) Φ(1) x̂(0|0) + Φ(2) K(1) z(1) + K(2) z(2)
x̂(3|3) = Φ(3) Φ(2) Φ(1) x̂(0|0) + Φ(3) Φ(2) K(1) z(1) + Φ(3) K(2) z(2) + K(3) z(3)

DISCRETE KALMAN FILTER - Properties
Innovation process:
r(k+1) = z(k+1) - C(k+1) x̂(k+1|k), with x̂(k+1|k) = E[x(k+1) | Z(k)].
z(k+1) carries information on x(k+1) that was not available in Z(k); this new information is represented by r(k+1), the innovation process.
Properties of the innovation process:
the innovations r(k) are orthogonal to z(i): E[r(k) z^T(i)] = 0, i = 1, 2, ..., k-1;
the innovations are uncorrelated/white noise: E[r(k) r^T(i)] = 0, i ≠ k.
This test can be used to assess whether the filter is operating correctly.
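The whiteness property E[r(k) r^T(i)] = 0 for i ≠ k suggests a practical diagnostic: run the filter, collect the innovations, and check that their sample autocorrelation at nonzero lags is near zero. A minimal scalar sketch (the model a = 0.9, c = 1 and the noise levels are assumptions for illustration):

```python
import numpy as np

# Whiteness check on the innovation sequence of a scalar Kalman filter.
rng = np.random.default_rng(2)
a, c, q, r = 0.9, 1.0, 0.1, 0.5        # illustrative plant and noise parameters
x, x_hat, p = 0.0, 0.0, 1.0
innov = []
for k in range(2000):
    x = a * x + rng.normal(0.0, np.sqrt(q))        # plant
    z = c * x + rng.normal(0.0, np.sqrt(r))
    x_pred, p_pred = a * x_hat, a * a * p + q      # prediction cycle
    s = c * c * p_pred + r                          # innovation covariance
    kgain = p_pred * c / s                          # Kalman gain
    innov.append(z - c * x_pred)                    # innovation r(k)
    x_hat = x_pred + kgain * (z - c * x_pred)       # filtering cycle
    p = p_pred - kgain * c * p_pred
rho = np.corrcoef(innov[:-1], innov[1:])[0, 1]      # lag-1 sample correlation
print(abs(rho))   # close to 0 when the filter model matches the plant
```

If the filter's model (a, c, q, r) were wrong, the innovations would become correlated and |rho| would grow, which is exactly how this test flags a misbehaving filter.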

DISCRETE KALMAN FILTER - Properties
Covariance matrix of the innovation process:
S(k+1) = C(k+1) P(k+1|k) C^T(k+1) + R(k+1)
so the Kalman gain K(k+1) = P(k+1|k) C^T(k+1) [C(k+1) P(k+1|k) C^T(k+1) + R(k+1)]^(-1) can be written as
K(k+1) = P(k+1|k) C^T(k+1) S^(-1)(k+1).

DISCRETE KALMAN FILTER - Properties
The Discrete KF provides an unbiased estimate of the state: x̂(k+1|k+1) is an unbiased estimate of x(k+1), provided that the initial conditions are x̂(0|0) = x̄_0 and P(0|0) = P_0.
Is this still true if the filter initial conditions are not those specified?

DISCRETE KALMAN FILTER - Steady state Kalman Filter
Time-invariant system and stationary white system and observation noise:
x(k+1) = A x(k) + G w(k), z(k) = C x(k) + v(k), E[w(k) w^T(k)] = Q, E[v(k) v^T(k)] = R.
Filter dynamics:
x̂(k+1|k+1) = A x̂(k|k) + K(k+1) (z(k+1) - C A x̂(k|k))
Discrete Riccati equation:
P(k+1|k) = A P(k|k-1) A^T - A P(k|k-1) C^T [C P(k|k-1) C^T + R]^(-1) C P(k|k-1) A^T + G Q G^T

DISCRETE KALMAN FILTER - Steady state Kalman Filter
If Q is positive definite, (A, G Q^(1/2)) is controllable, and (A, C) is observable, then:
the steady-state Kalman filter exists, i.e., the limit lim_{k→∞} P(k+1|k) = P̄ exists;
P̄ is the unique, finite, positive-semidefinite solution of the algebraic equation
P̄ = A P̄ A^T - A P̄ C^T [C P̄ C^T + R]^(-1) C P̄ A^T + G Q G^T;
P̄ is independent of P_0, provided that P_0 ≥ 0;
the steady-state Kalman filter is asymptotically unbiased, with constant gain
K̄ = P̄ C^T [C P̄ C^T + R]^(-1).
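One way to obtain P̄ numerically is simply to iterate the discrete Riccati equation until it converges, then read off the steady-state gain K̄. A sketch for an illustrative constant-velocity system (the matrices below are assumptions, not from the lecture):

```python
import numpy as np

# Iterate the discrete Riccati equation to the steady-state solution P_bar.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # illustrative time-invariant plant
G = np.eye(2)
C = np.array([[1.0, 0.0]])               # (A, C) observable
Q = 0.01 * np.eye(2)                     # Q positive definite
R = np.array([[1.0]])

P = np.eye(2)                            # any P(1|0) >= 0 works
for _ in range(500):
    S = C @ P @ C.T + R
    P = A @ P @ A.T - A @ P @ C.T @ np.linalg.inv(S) @ C @ P @ A.T + G @ Q @ G.T

K_bar = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)   # steady-state gain
# Residual of the algebraic Riccati equation: should be ~0 at convergence.
resid = (A @ P @ A.T
         - A @ P @ C.T @ np.linalg.inv(C @ P @ C.T + R) @ C @ P @ A.T
         + G @ Q @ G.T - P)
print(np.max(np.abs(resid)))
```

Changing the initial P above leaves the converged P̄ unchanged, illustrating the independence-from-P_0 property stated on the slide.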

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs
Let z be a Gaussian random vector of dimension n with
E[z] = m, E[(z - m)(z - m)^T] = P,
where P, the covariance matrix, is symmetric and positive definite. Probability density function:
p(z) = (2π)^(-n/2) (det P)^(-1/2) exp(-(1/2) (z - m)^T P^(-1) (z - m))

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs
Locus of points where the pdf is greater than or equal to a given threshold:
(z - m)^T P^(-1) (z - m) ≤ K
n=1: line segment; n=2: ellipse and inner points; n=3: 3D ellipsoid and inner points; n>3: hyperellipsoid and inner points.
If P = diag(σ_1², σ_2², ..., σ_n²), the ellipsoid axes are aligned with the axes of the frame where the vector z is defined:
Σ_i (z_i - m_i)²/σ_i² ≤ K, and the length of the i-th ellipsoid semi-axis is σ_i √K.

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs - Error ellipsoid
Example for n=2: with P = diag(σ_x², σ_y²) the error ellipse is aligned with the frame axes; with a full matrix P = [σ_x², σ_xy; σ_xy, σ_y²] the ellipse is rotated. (Figure: the two error ellipses.)

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs - Error ellipsoid and axis orientation
Error ellipsoid: (z - m_z)^T P^(-1) (z - m_z) ≤ K.
Since P = P^T, distinct eigenvalues correspond to orthogonal eigenvectors. As P is diagonalizable,
P = T D T^T, with D = diag(λ_1, λ_2, ..., λ_n) and T T^T = I.
After the coordinate transformation w = T^T z, the error ellipsoid becomes
(w - m_w)^T D^(-1) (w - m_w) ≤ K.
In the new coordinate system, the ellipsoid axes are aligned with the axes of the new frame.

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs - Error ellipse and frame axes
Example for n=2, z = [x y]^T, P = diag(σ_x², σ_y²):
(x - m_x)²/σ_x² + (y - m_y)²/σ_y² ≤ K
The boundary is an ellipse centered at (m_x, m_y) with semi-axes √(K σ_x²) and √(K σ_y²), aligned with the frame axes.

MEANING OF THE COVARIANCE MATRIX - Generalities on Gaussian pdfs - Error ellipse and frame axes
Example for n=2, z = [x y]^T, with correlated components:
P = [σ_x², ρσ_xσ_y; ρσ_xσ_y, σ_y²]
Eigenvalues:
λ_{1,2} = (1/2) [σ_x² + σ_y² ± √((σ_x² - σ_y²)² + 4ρ²σ_x²σ_y²)]
The ellipse defined by P^(-1) and the threshold K has semi-axes √(Kλ_1) and √(Kλ_2), and its axes make an angle α with the frame axes given by
tan(2α) = 2ρσ_xσ_y / (σ_x² - σ_y²), with -π/4 ≤ α ≤ π/4.
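The semi-axes and orientation above follow directly from the eigendecomposition P = T D T^T; the sketch below computes both for an illustrative 2-D covariance (the values σ_x = 2, σ_y = 1, ρ = 0.5 are assumptions for the example):

```python
import numpy as np

# Semi-axes and orientation of the K-level error ellipse of a 2-D Gaussian.
sx, sy, rho = 2.0, 1.0, 0.5                      # illustrative values
P = np.array([[sx**2,        rho * sx * sy],
              [rho * sx * sy, sy**2]])
K = 5.99                                         # ~95% region for n = 2

lam, T = np.linalg.eigh(P)                       # P = T diag(lam) T^T
semi_axes = np.sqrt(K * lam)                     # ellipse semi-axis lengths
alpha = 0.5 * np.arctan2(2 * rho * sx * sy, sx**2 - sy**2)  # axis orientation
print(semi_axes, alpha)
```

The eigenvalues match the closed-form λ_{1,2} on the slide, and alpha agrees with the angle of the eigenvector associated with the largest eigenvalue.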

DISCRETE KALMAN FILTER - Probabilistic interpretation of the error ellipsoid
p(x(k) | Z(k)) ~ N(x̂(k|k), P(k|k))
Given x̂(k|k) and P(k|k), it is possible to define the locus where, with a given probability, the values of the random vector x(k) lie: a hyperellipsoid centered at x̂(k|k), with semi-axes proportional to the square roots of the eigenvalues of P(k|k).

DISCRETE KALMAN FILTER - Probabilistic interpretation of the error ellipsoid
p(x(k) | Z(k)) ~ N(x̂(k|k), P(k|k)). Example for n=2:
M = {x : [x - x̂(k|k)]^T P^(-1)(k|k) [x - x̂(k|k)] ≤ K}
Pr{x(k) ∈ M} is a function of K; a pre-specified value of this probability can be obtained by an appropriate choice of K.

DISCRETE KALMAN FILTER - Probabilistic interpretation of the error ellipsoid
For x(k) ∈ R^n with p(x(k) | Z(k)) ~ N(x̂(k|k), P(k|k)), the scalar random variable
[x - x̂(k|k)]^T P^(-1)(k|k) [x - x̂(k|k)]
has a χ² distribution with n degrees of freedom. How to choose K for a desired probability? Just consult a chi-square distribution table:
Probability = 90%: n=1, K=2.71; n=2, K=4.61.
Probability = 95%: n=1, K=3.84; n=2, K=5.99.
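For the common n = 2 case the table lookup has a closed form: the χ² cdf with 2 degrees of freedom is 1 - exp(-K/2), so the threshold for a desired probability p is K = -2 ln(1 - p). A one-line sketch reproducing the n = 2 table entries:

```python
import math

# Chi-square threshold for n = 2 degrees of freedom: cdf(K) = 1 - exp(-K/2),
# hence K = -2 ln(1 - p) for a desired probability p.
def k_for_probability(p):
    return -2.0 * math.log(1.0 - p)

print(round(k_for_probability(0.90), 2))   # 4.61
print(round(k_for_probability(0.95), 2))   # 5.99
```

For other values of n there is no such closed form and the table (or a numerical inverse of the χ² cdf) is needed.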

DISCRETE KALMAN FILTER - The error ellipsoid and the filter dynamics
Prediction cycle: the ellipsoid defined by x̂(k|k) and P(k|k) is propagated, through the input u(k) and the system noise w(k) (covariance Q), to the one defined by x̂(k+1|k) and P(k+1|k):
x(k+1) = A(k) x(k) + B(k) u(k) + G(k) w(k)
x̂(k+1|k) = A(k) x̂(k|k) + B(k) u(k)
P(k+1|k) = A(k) P(k|k) A^T(k) + G(k) Q(k) G^T(k)

DISCRETE KALMAN FILTER - The error ellipsoid and the filter dynamics
Filtering cycle: the measurement z(k+1) = C(k+1) x(k+1) + v(k+1) (measurement noise covariance R) shrinks the ellipsoid from (x̂(k+1|k), P(k+1|k)) to (x̂(k+1|k+1), P(k+1|k+1)):
r(k+1) = z(k+1) - C(k+1) x̂(k+1|k)
S(k+1) = C(k+1) P(k+1|k) C^T(k+1) + R(k+1)
x̂(k+1|k+1) = x̂(k+1|k) + K(k+1) r(k+1)
P(k+1|k+1) = P(k+1|k) - K(k+1) C(k+1) P(k+1|k)

Extended Kalman Filter
Nonlinear dynamics, white Gaussian system and observation noise:
x(k+1) = f_k(x(k), u(k)) + w(k)
z(k) = h_k(x(k)) + v(k)
x(0) ~ N(x̄_0, P_0), E[w(k) w^T(j)] = Q(k) δ(k-j), E[v(k) v^T(j)] = R(k) δ(k-j).
QUESTION: which is the MMSE (minimum mean-square error) estimate of x(k+1)? The conditional mean x̂(k+1|k+1) = E[x(k+1) | Z(k+1)]. But due to the nonlinearity of the system, the densities p(x(k) | Z(k)) are non-Gaussian. What is p(x(k+1) | Z(k+1))?

Extended Kalman Filter
(Optimal) ANSWER: the MMSE estimate is given by a nonlinear filter that propagates the conditional pdf.
The EKF gives an approximation of the optimal estimate: the nonlinearities are approximated by a linearized version of the nonlinear model around the last state estimate. For this approximation to be valid, the linearization should be a good approximation of the nonlinear model in all the uncertainty domain associated with the state estimate.

Extended Kalman Filter
p(x(k) | Z(k)), x̂(k|k) → linearize x(k+1) = f_k(x(k), u(k)) + w(k) around x̂(k|k) and apply the KF prediction to the linearized dynamics → p(x(k+1) | Z(k)), x̂(k+1|k) → linearize z(k+1) = h_{k+1}(x(k+1)) + v(k+1) around x̂(k+1|k) and apply the KF update to the linearized observation → p(x(k+1) | Z(k+1)), x̂(k+1|k+1)

Extended Kalman Filter
Linearize x(k+1) = f_k(x(k), u(k)) + w(k) around x̂(k|k):
f_k(x(k), u(k)) ≈ f_k(x̂(k|k), u(k)) + ∇f_k (x(k) - x̂(k|k)) + ...
x(k+1) ≈ ∇f_k x(k) + w(k) + [f_k(x̂(k|k), u(k)) - ∇f_k x̂(k|k)]  (known input)
Prediction cycle of the KF applied to the linearized dynamics:
x̂(k+1|k) = ∇f_k x̂(k|k) + [f_k(x̂(k|k), u(k)) - ∇f_k x̂(k|k)] = f_k(x̂(k|k), u(k))
P(k+1|k) = ∇f_k P(k|k) ∇f_k^T + Q(k)

Extended Kalman Filter
Linearize z(k+1) = h_{k+1}(x(k+1)) + v(k+1) around x̂(k+1|k):
h_{k+1}(x(k+1)) ≈ h_{k+1}(x̂(k+1|k)) + ∇h_{k+1} (x(k+1) - x̂(k+1|k)) + ...
z(k+1) ≈ ∇h_{k+1} x(k+1) + v(k+1) + [h_{k+1}(x̂(k+1|k)) - ∇h_{k+1} x̂(k+1|k)]  (known input)
Update cycle of the KF applied to the linearized observation:
x̂(k+1|k+1) = x̂(k+1|k) + P(k+1|k) ∇h_{k+1}^T [∇h_{k+1} P(k+1|k) ∇h_{k+1}^T + R(k+1)]^(-1) [z(k+1) - h_{k+1}(x̂(k+1|k))]
P(k+1|k+1) = P(k+1|k) - P(k+1|k) ∇h_{k+1}^T [∇h_{k+1} P(k+1|k) ∇h_{k+1}^T + R(k+1)]^(-1) ∇h_{k+1} P(k+1|k)
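The two linearized cycles above can be sketched as code: propagate the estimate through the nonlinear f and h, but use the Jacobians ∇f and ∇h for the covariances. The scalar functions f, h and their Jacobians below are an illustrative assumption, not an example from the lecture:

```python
import numpy as np

# Illustrative scalar nonlinear model (hypothetical, for the sketch only).
def f(x):  return np.array([np.sin(x[0])])      # nonlinear dynamics
def F(x):  return np.array([[np.cos(x[0])]])    # Jacobian of f
def h(x):  return np.array([x[0] ** 3])         # nonlinear measurement
def H(x):  return np.array([[3 * x[0] ** 2]])   # Jacobian of h

def ekf_step(x_hat, P, z, Q, R):
    """One EKF recursion: linearize around the latest estimate."""
    # Prediction cycle: mean through f, covariance through the Jacobian of f.
    x_pred = f(x_hat)
    Fk = F(x_hat)
    P_pred = Fk @ P @ Fk.T + Q
    # Update cycle: linearize h around the predicted estimate x_pred.
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R
    K = P_pred @ Hk.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))        # innovation uses nonlinear h
    P_new = P_pred - K @ Hk @ P_pred
    return x_new, P_new

x_hat, P = np.array([0.5]), np.eye(1)
Q, R = 0.01 * np.eye(1), 0.1 * np.eye(1)
x_hat, P = ekf_step(x_hat, P, np.array([0.1]), Q, R)
print(x_hat.shape, float(P[0, 0]))
```

Unlike the linear KF, the covariance here depends on the estimates (through the Jacobians evaluated at x̂), so it cannot be computed offline, and the resulting estimate is only an approximation of the MMSE estimate.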

References
B. D. O. Anderson, J. B. Moore, Optimal Filtering, Prentice-Hall, 1979.
M. Athans, Dynamic Stochastic Estimation, Prediction and Smoothing, Series of Lectures, Spring 1999.
E. W. Kamen, J. K. Su, Introduction to Optimal Estimation, Springer, 1999.
Peter S. Maybeck, The Kalman Filter: an Introduction to Concepts.
Jerry M. Mendel, Lessons in Digital Estimation Theory, Prentice-Hall, 1987.
M. Isabel Ribeiro, Notas Dispersas sobre Filtragem de Kalman, CAPS, IST, June 1989 (http://www.isr.ist.utl.pt/~mir)