Introduction to Kalman Filtering


1 Introduction to Kalman Filtering. A set of two lectures. Maria Isabel Ribeiro, Associate Professor, Instituto Superior Técnico / Instituto de Sistemas e Robótica. June. All rights reserved.

2 INTRODUCTION TO KALMAN FILTERING
What is a Kalman Filter? Introduction to the Concept; Which is the best estimate?; Basic Assumptions.
Discrete Kalman Filter: Problem Formulation; From the Assumptions to the Problem Solution; Towards the Solution; Filter dynamics (Prediction cycle, Filtering cycle); Summary; Properties of the Discrete KF; A simple example.
The meaning of the error covariance matrix.
The Extended Kalman Filter.

3 WHAT IS A KALMAN FILTER? Optimal Recursive Data Processing Algorithm. Typical Kalman filter application: a system, driven by controls and affected by system error sources, has a state that is desired but not known; measuring devices, subject to measurement error sources, provide the observed measurements; the Kalman filter combines these to produce an optimal estimate of the system state.

4 WHAT IS A KALMAN FILTER? Introduction to the Concept. Optimal Recursive Data Processing Algorithm. Optimality depends upon the criteria chosen to evaluate performance; under certain assumptions, the KF is optimal with respect to virtually any criterion that makes sense. The KF incorporates all available information: knowledge of the system and measurement device dynamics; the statistical description of the system noises, measurement errors, and uncertainty in the dynamics models; any available information about the initial conditions of the variables of interest.

5 WHAT IS A KALMAN FILTER? Introduction to the concept. Optimal Recursive Data Processing Algorithm. Model:
x(k+1) = f(x(k), u(k), w(k))
z(k+1) = h(x(k+1), v(k+1))
where x is the state, f the system dynamics, h the measurement function, u the controls, w the system error sources, v the measurement error sources and z the observed measurements. Given f, h, the noise characterization, the initial conditions and the measurements z(1), z(2), ..., z(k), obtain the best estimate of x(k).

6 WHAT IS A KALMAN FILTER? Introduction to the concept. Optimal Recursive Data Processing Algorithm: the KF does not require all previous data to be kept in storage and reprocessed every time a new measurement is taken. Processing z(1), z(2), ..., z(k) yields x̂(k|k); processing z(1), ..., z(k), z(k+1) yields x̂(k+1|k+1). To evaluate x̂(k+1|k+1) the KF only requires x̂(k|k) and z(k+1).

7 WHAT IS A KALMAN FILTER? Introduction to the concept. Optimal Recursive Data Processing Algorithm. The KF is a data processing algorithm: a computer program running in a central processor.

8 WHAT IS THE KALMAN FILTER? Which is the best estimate? Any type of filter tries to obtain an optimal estimate of desired quantities from data provided by a noisy environment. Best = minimizing errors in some respect. Bayesian viewpoint: the filter propagates the conditional probability density of the desired quantities, conditioned on the knowledge of the actual data coming from the measuring devices. Why base the state estimation on the conditional probability density function?

9 WHAT IS A KALMAN FILTER? Which is the best estimate? Example: x(i) is the one-dimensional position of a vehicle at time instant i; z(j) is a two-dimensional vector with the measurements of that position at time j by two separate radars. If z(1)=z_1, z(2)=z_2, ..., z(j)=z_j, then p_{x(i)|z(1),...,z(i)}(x | z_1, z_2, ..., z_i) represents all the information we have on x(i) based (conditioned) on the measurements acquired up to time i: given the values of all measurements taken up to time i, this conditional pdf indicates what the probability would be of x(i) assuming any particular value or range of values.

10 WHAT IS A KALMAN FILTER? Which is the best estimate? The shape of p_{x(i)|z(1),...,z(i)}(x | z_1, z_2, ..., z_i) conveys the amount of certainty we have in the knowledge of the value of x. Based on this conditional pdf, the estimate can be: the mean, the center of probability mass (MMSE); the mode, the value of x with the highest probability (MAP); or the median, the value of x such that half the probability weight lies to the left and half to the right of it.
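
As a quick illustration (not part of the original slides; the numbers are made up), the three candidate estimates can be computed from samples drawn from a conditional pdf; for a Gaussian pdf all three coincide:

```python
import numpy as np

# Hypothetical samples standing in for the conditional pdf p(x | z_1, ..., z_i).
rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=0.5, size=10_000)

mean_estimate = samples.mean()                    # MMSE estimate (conditional mean)
median_estimate = np.median(samples)              # median estimate
counts, edges = np.histogram(samples, bins=100)   # crude MAP estimate via a histogram
mode_estimate = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])

print(mean_estimate, median_estimate, mode_estimate)  # all close for a Gaussian pdf
```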

11 WHAT IS THE KALMAN FILTER? Basic Assumptions. The Kalman Filter performs the conditional probability density propagation for systems that can be described through a LINEAR model and in which the system and measurement noises are WHITE and GAUSSIAN. Under these assumptions the conditional pdf is Gaussian, so mean = mode = median, there is a unique best estimate of the state, and the KF is the best filter among all possible filter types. What happens if these assumptions are relaxed? Is the KF still an optimal filter? In which class of filters?

12 DISCRETE KALMAN FILTER Problem Formulation. MOTIVATION: given a discrete-time, linear, time-varying plant with random initial state, driven by white plant noise, and given noisy measurements of linear combinations of the plant state variables, determine the best estimate of the system state. STATE DYNAMICS AND MEASUREMENT EQUATION:
x_{k+1} = A_k x_k + B_k u_k + G_k w_k
z_k = C_k x_k + v_k

13 DISCRETE KALMAN FILTER Problem Formulation. VARIABLE DEFINITIONS:
x_k ∈ R^n - state vector (stochastic, non-white process)
u_k ∈ R^m - deterministic input sequence
w_k ∈ R^n - white Gaussian system noise (assumed zero mean)
v_k ∈ R^r - white Gaussian measurement noise (assumed zero mean)
z_k ∈ R^r - measurement vector (stochastic, non-white sequence)

14 DISCRETE KALMAN FILTER Problem Formulation. INITIAL CONDITIONS: x_0 is a Gaussian random vector with mean E[x_0] = x̄_0 and covariance matrix E[(x_0 − x̄_0)(x_0 − x̄_0)^T] = P_0. STATE AND MEASUREMENT NOISE: zero mean, E[w_k] = E[v_k] = 0; {w_k}, {v_k} are white Gaussian sequences with E[w_k w_j^T] = Q_k δ_{kj}, E[v_k v_j^T] = R_k δ_{kj} and E[w_k v_j^T] = 0; x(0), w_k and v_j are independent for all k and j.

15 DISCRETE KALMAN FILTER Problem Formulation. DEFINITION OF THE FILTERING PROBLEM: let k denote the present value of time. Given the sequence of past inputs U_{k-1} = {u_0, u_1, ..., u_{k-1}} and the sequence of past measurements Z_k = {z_1, z_2, ..., z_k}, evaluate the best estimate of the state x(k).

16 DISCRETE KALMAN FILTER Problem Formulation. Given x_0 (with x_{k+1} = A_k x_k + B_k u_k + G_k w_k, z_k = C_k x_k + v_k): we apply u_0 and nature applies w_0; the system moves to state x_1; we make a measurement z_1. Question: which is the best estimate of x_1? Answer: obtained from p(x_1|Z_1). We apply u_1 and nature applies w_1; the system moves to state x_2; we make a measurement z_2. Question: which is the best estimate of x_2? Answer: obtained from p(x_2|Z_2). ...

17 DISCRETE KALMAN FILTER Problem Formulation. ... Question: which is the best estimate of x_{k-1}? Answer: obtained from p(x_{k-1}|Z_{k-1}). We apply u_{k-1} and nature applies w_{k-1}; the system moves to state x_k; we make a measurement z_k. Question: which is the best estimate of x_k? Answer: obtained from p(x_k|Z_k). ...

18 DISCRETE KALMAN FILTER Towards the Solution. The filter has to propagate the conditional probability density functions p(x_0), p(x_1|Z_1), p(x_2|Z_2), ..., p(x_k|Z_k), from which the estimates x̂(1|1), x̂(2|2), ..., x̂(k|k) are obtained.

19 DISCRETE KALMAN FILTER From the Assumptions to the Problem Solution. The LINEARITY of the system state equation and of the system observation equation, together with the GAUSSIAN nature of the initial state x_0, the system white noise w_k and the measurement white noise v_k, imply that p(x_k|Z_k) is Gaussian, hence uniquely characterized by the conditional mean x̂(k|k) = E[x_k|Z_k] and the conditional covariance P(k|k) = cov[x_k; x_k|Z_k]: p(x_k|Z_k) ~ N(x̂(k|k), P(k|k)).

20 DISCRETE KALMAN FILTER Towards the Solution. As the conditional probability density functions are Gaussian, the Kalman filter only propagates their first two moments: for each k, the conditional mean E[x_k|Z_k] = x̂(k|k) and the conditional covariance P(k|k).

21 DISCRETE KALMAN FILTER Towards the Solution. We stated that the state estimate equals the conditional mean, x̂(k|k) = E[x_k|Z_k]. Why? Why not the mode of p(x_k|Z_k)? Why not the median of p(x_k|Z_k)? Because p(x_k|Z_k) is Gaussian, mean = mode = median.

22 DISCRETE KALMAN FILTER Filter dynamics. The KF dynamics is recursive: from p(x_k|Z_k), with Z_k = {z_1, z_2, ..., z_k} and U_k = {u_0, u_1, ..., u_k}, to p(x_{k+1}|Z_k), and then to p(x_{k+1}|Z_{k+1}), where Z_{k+1} = {Z_k, z_{k+1}} and U_{k+1} = {U_k, u_{k+1}}. Prediction cycle: what can we say about x_{k+1} before we make the measurement z_{k+1}? Filtering cycle: how can we improve our information on x_{k+1} after we make the measurement z_{k+1}?

23 DISCRETE KALMAN FILTER Filter dynamics. The filter alternates prediction and filtering steps:
p(x_0) → prediction → p(x_1|U_0) → filtering → p(x_1|Z_1) → prediction → p(x_2|Z_1) → filtering → p(x_2|Z_2) → ... → prediction → p(x_{k+1}|Z_k) → filtering → p(x_{k+1}|Z_{k+1}) → ...

24 DISCRETE KALMAN FILTER Filter dynamics - Prediction cycle. Prediction cycle: p(x_k|Z_k) ~ N(x̂(k|k), P(k|k)) is assumed known. What is p(x_{k+1}|Z_k)? It is Gaussian, characterized by x̂(k+1|k) = E[x_{k+1}|Z_k] and P(k+1|k) = cov[x_{k+1}; x_{k+1}|Z_k]. For the mean,
E[x_{k+1}|Z_k] = E[A_k x_k + B_k u_k + G_k w_k | Z_k] = A_k E[x_k|Z_k] + B_k E[u_k|Z_k] + G_k E[w_k|Z_k],
and therefore x̂(k+1|k) = A_k x̂(k|k) + B_k u_k.

25 DISCRETE KALMAN FILTER Filter dynamics - Prediction cycle. For the covariance P(k+1|k) = cov[x_{k+1}; x_{k+1}|Z_k], define the prediction error
x̃(k+1|k) = x_{k+1} − x̂(k+1|k) = (A_k x_k + B_k u_k + G_k w_k) − (A_k x̂(k|k) + B_k u_k) = A_k x̃(k|k) + G_k w_k.
Using cov[y; y] = E[(y − ȳ)(y − ȳ)^T],
P(k+1|k) = E[x̃(k+1|k) x̃(k+1|k)^T | Z_k] = A_k P(k|k) A_k^T + G_k Q_k G_k^T.
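
The prediction cycle above maps directly to a few lines of code. The sketch below (Python/NumPy; the argument names are placeholders, not from the slides) assumes all matrices already carry their time index:

```python
import numpy as np

def kf_predict(x_filt, P_filt, A, B, u, G, Q):
    """Prediction cycle: from x̂(k|k), P(k|k) to x̂(k+1|k), P(k+1|k)."""
    x_pred = A @ x_filt + B @ u              # x̂(k+1|k) = A_k x̂(k|k) + B_k u_k
    P_pred = A @ P_filt @ A.T + G @ Q @ G.T  # P(k+1|k) = A_k P(k|k) A_k^T + G_k Q_k G_k^T
    return x_pred, P_pred
```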

26 DISCRETE KALMAN FILTER Filter dynamics - Filtering cycle. Filtering cycle: given p(x_{k+1}|Z_k) ~ N(x̂(k+1|k), P(k+1|k)) and the new measurement z_{k+1}, obtain p(x_{k+1}|Z_{k+1}). Step 1 - Measurement prediction: what can we say about z_{k+1} before we make the measurement? p(z_{k+1}|Z_k) = p(C_{k+1} x_{k+1} + v_{k+1} | Z_k), with
E[z_{k+1}|Z_k] = ẑ(k+1|k) = C_{k+1} x̂(k+1|k)
cov[z_{k+1}; z_{k+1}|Z_k] = P_z(k+1|k) = C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}.

27 DISCRETE KALMAN FILTER Filter dynamics - Filtering cycle. Step 2: evaluate p(x_{k+1}|Z_{k+1}), i.e., E[x_{k+1}|Z_{k+1}] = E[x_{k+1}|Z_k, z_{k+1}]. Z_{k+1} and {Z_k, z̃(k+1)} are equivalent from the point of view of the information they contain, so E[x_{k+1}|Z_{k+1}] = E[x_{k+1}|Z_k, z̃(k+1)]. If x, y and z are jointly Gaussian and y and z are statistically independent, then E[x|y, z] = E[x|y] + E[x|z] − m_x, which gives the required result.

28 DISCRETE KALMAN FILTER Filter dynamics - Filtering cycle. Filtering cycle results:
x̂(k+1|k+1) = x̂(k+1|k) + P(k+1|k) C_{k+1}^T [C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}]^{-1} (z_{k+1} − C_{k+1} x̂(k+1|k)),
i.e., x̂(k+1|k+1) = x̂(k+1|k) + K(k+1) z̃(k+1), where K(k+1) is the Kalman gain, ẑ(k+1|k) = C_{k+1} x̂(k+1|k) is the measurement prediction and z̃(k+1) = z_{k+1} − ẑ(k+1|k);
P(k+1|k+1) = P(k+1|k) − P(k+1|k) C_{k+1}^T [C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}]^{-1} C_{k+1} P(k+1|k).
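
The filtering cycle can be sketched the same way (Python/NumPy, placeholder names; a sketch under the assumptions above, not a reference implementation):

```python
import numpy as np

def kf_update(x_pred, P_pred, z, C, R):
    """Filtering cycle: from x̂(k+1|k), P(k+1|k) and z(k+1) to x̂(k+1|k+1), P(k+1|k+1)."""
    S = C @ P_pred @ C.T + R             # innovation covariance S(k+1)
    K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain K(k+1)
    innovation = z - C @ x_pred          # z̃(k+1) = z(k+1) - C x̂(k+1|k)
    x_filt = x_pred + K @ innovation     # x̂(k+1|k+1)
    P_filt = P_pred - K @ C @ P_pred     # P(k+1|k+1)
    return x_filt, P_filt, innovation, S
```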

29 DISCRETE KALMAN FILTER Dynamics. Linear system: x_{k+1} = A_k x_k + B_k u_k + G_k w_k, z_k = C_k x_k + v_k. Discrete Kalman Filter:
Prediction: x̂(k+1|k) = A_k x̂(k|k) + B_k u_k; P(k+1|k) = A_k P(k|k) A_k^T + G_k Q_k G_k^T.
Filtering: x̂(k+1|k+1) = x̂(k+1|k) + K(k+1)(z_{k+1} − C_{k+1} x̂(k+1|k)); P(k+1|k+1) = P(k+1|k) − P(k+1|k) C_{k+1}^T [C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}]^{-1} C_{k+1} P(k+1|k); K(k+1) = P(k+1|k) C_{k+1}^T [C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}]^{-1}.
Initial conditions: x̂(0|0) = x̄_0, P(0|0) = P_0.
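
Putting the two cycles together, a self-contained toy run on a hypothetical constant-velocity model (all matrices and numbers below are illustrative, not from the lectures):

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamics (no input, B = 0)
C = np.array([[1.0, 0.0]])             # only the position is measured
G = np.eye(2)
Q = 0.01 * np.eye(2)                   # process noise covariance
R = np.array([[0.25]])                 # measurement noise covariance

x_true = np.array([0.0, 1.0])
x_filt, P_filt = np.zeros(2), 10.0 * np.eye(2)   # x̂(0|0), P(0|0)

for k in range(50):
    # simulate the plant and a noisy measurement
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
    z = C @ x_true + rng.multivariate_normal(np.zeros(1), R)
    # prediction cycle
    x_pred = A @ x_filt
    P_pred = A @ P_filt @ A.T + G @ Q @ G.T
    # filtering cycle
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_filt = x_pred + K @ (z - C @ x_pred)
    P_filt = P_pred - K @ C @ P_pred

print("final estimate:", x_filt, "true state:", x_true)
```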

30 DISCRETE KALMAN FILTER Properties. The Discrete KF is a time-varying linear system:
x̂(k+1|k+1) = (I − K_{k+1} C_{k+1})(A_k x̂(k|k) + B_k u_k) + K_{k+1} z_{k+1}.
Even when the system is time-invariant and has stationary noise,
x̂(k+1|k+1) = (I − K_{k+1} C)(A x̂(k|k) + B u_k) + K_{k+1} z_{k+1},
the Kalman gain is not constant. Does the Kalman gain matrix converge to a constant matrix? Under which conditions?

31 DISCRETE KALMAN FILTER Properties. The state estimate is a linear function of the measurements. Writing the KF dynamics in terms of the filtering estimate, x̂(k+1|k+1) = Φ_k x̂(k|k) + K_{k+1} z_{k+1} + (I − K_{k+1} C_{k+1}) B_k u_k, with Φ_k = (I − K_{k+1} C_{k+1}) A_k and x̂(0|0) = x̄_0. Assuming null inputs for the sake of simplicity:
x̂(1|1) = Φ_0 x̄_0 + K_1 z_1
x̂(2|2) = Φ_1 Φ_0 x̄_0 + Φ_1 K_1 z_1 + K_2 z_2
x̂(3|3) = Φ_2 Φ_1 Φ_0 x̄_0 + Φ_2 Φ_1 K_1 z_1 + Φ_2 K_2 z_2 + K_3 z_3

32 DISCRETE KALMAN FILTER Properties. Innovation process: r(k+1) = z_{k+1} − C_{k+1} x̂(k+1|k), with x̂(k+1|k) = E[x_{k+1}|Z_k]. z(k+1) carries information on x(k+1) that was not available in Z_k; this new information is represented by r(k+1), the innovation process. Properties of the innovation process: the innovations r(k) are orthogonal to the z(i), E[r(k) z^T(i)] = 0, i = 1, 2, ..., k−1; the innovations are uncorrelated/white noise, E[r(k) r^T(i)] = 0 for i ≠ k. This test can be used to assess whether the filter is operating correctly.
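
One practical way to apply this whiteness property is to store the innovations while the filter runs and look at their sample autocorrelation; a sketch for a scalar innovation sequence (the function and variable names are hypothetical):

```python
import numpy as np

def innovation_autocorrelation(innovations, max_lag=10):
    """Sample autocorrelation of a scalar innovation sequence.

    For a correctly operating filter the values at lags >= 1 should be
    close to zero (roughly within ±2/sqrt(N) for N samples).
    """
    r = np.asarray(innovations, dtype=float).ravel()
    r = r - r.mean()
    denom = np.sum(r * r)
    return np.array([np.sum(r[lag:] * r[:len(r) - lag]) / denom
                     for lag in range(max_lag + 1)])

# Example: a white sequence gives autocorrelations near zero for every lag >= 1.
print(innovation_autocorrelation(np.random.default_rng(2).normal(size=500)))
```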

33 DISCRETE KALMAN FILTER Properties. Covariance matrix of the innovation process: S(k+1) = C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}. The Kalman gain K(k+1) = P(k+1|k) C_{k+1}^T [C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}]^{-1} can therefore be written as K(k+1) = P(k+1|k) C_{k+1}^T S(k+1)^{-1}.

34 DISCRETE KALMAN FILTER Properties. The Discrete KF provides an unbiased estimate of the state: x̂(k+1|k+1) is an unbiased estimate of x(k+1), provided that the initial conditions are x̂(0|0) = x̄_0 and P(0|0) = P_0. Is this still true if the filter initial conditions are not those specified?

35 DISCRETE KALMAN FILTER Steady state Kalman Filter. Time-invariant system with stationary white system and observation noise: x_{k+1} = A x_k + G w_k, z_k = C x_k + v_k, with E[w_k w_k^T] = Q and E[v_k v_k^T] = R. Filter dynamics:
x̂(k+1|k+1) = A x̂(k|k) + K(k+1)(z_{k+1} − C A x̂(k|k)),
P(k+1|k) = A P(k|k−1) A^T − A P(k|k−1) C^T [C P(k|k−1) C^T + R]^{-1} C P(k|k−1) A^T + G Q G^T (Discrete Riccati Equation).

36 DISCRETE KALMAN FILTER Steady state Kalman Filter. If Q is positive definite, (A, G Q^{1/2}) is controllable, and (A, C) is observable, then the steady-state Kalman filter exists: the limit lim P(k+1|k) = P̄ exists; P̄ is the unique, finite, positive-semidefinite solution of the algebraic equation P̄ = A P̄ A^T − A P̄ C^T [C P̄ C^T + R]^{-1} C P̄ A^T + G Q G^T; P̄ is independent of P_0, provided P_0 ≥ 0; the steady-state Kalman filter is asymptotically unbiased, with constant gain K̄ = P̄ C^T [C P̄ C^T + R]^{-1}.
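
Under these conditions the steady-state covariance can be obtained numerically by iterating the discrete Riccati equation until it stops changing; a sketch (hypothetical helper name, scalar example made up for illustration):

```python
import numpy as np

def steady_state_gain(A, C, G, Q, R, tol=1e-10, max_iter=10_000):
    """Iterate P <- A P A^T - A P C^T [C P C^T + R]^-1 C P A^T + G Q G^T until convergence."""
    P = np.eye(A.shape[0])
    for _ in range(max_iter):
        S = C @ P @ C.T + R
        P_next = A @ P @ A.T - A @ P @ C.T @ np.linalg.inv(S) @ C @ P @ A.T + G @ Q @ G.T
        if np.max(np.abs(P_next - P)) < tol:
            P = P_next
            break
        P = P_next
    K_bar = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)   # constant steady-state gain
    return P, K_bar

# Hypothetical scalar example: x(k+1) = 0.9 x(k) + w(k), z(k) = x(k) + v(k).
P_bar, K_bar = steady_state_gain(np.array([[0.9]]), np.array([[1.0]]),
                                 np.array([[1.0]]), np.array([[0.1]]), np.array([[0.5]]))
print(P_bar, K_bar)
```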

37 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf. Let z be a Gaussian random vector of dimension n, with E[z] = m and E[(z − m)(z − m)^T] = P, where P is the covariance matrix (symmetric, positive definite). Probability density function:
p(z) = (2π)^{-n/2} (det P)^{-1/2} exp(−(1/2)(z − m)^T P^{-1}(z − m)).

38 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf. Locus of points where the pdf is greater than or equal to a given threshold: (z − m)^T P^{-1}(z − m) ≤ K. For n=1 this is a line segment; for n=2 an ellipse and its inner points; for n=3 a 3D ellipsoid and its inner points; for n>3 a hyperellipsoid and its inner points. If P = diag(σ_1^2, σ_2^2, ..., σ_n^2), the ellipsoid axes are aligned with the axes of the reference frame where the vector z is defined, since (z − m)^T P^{-1}(z − m) = Σ_{i=1}^n (z_i − m_i)^2 / σ_i^2 ≤ K, and the length of the i-th ellipse semi-axis is σ_i √K.

39 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf - Error ellipsoid. Example for n=2 with diagonal covariance, P = diag(σ_1^2, σ_2^2): the error ellipse is centered at the mean and its axes, of lengths proportional to σ_1 and σ_2, are aligned with the coordinate axes.

40 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf - Error ellipsoid and axis orientation. Error ellipsoid: (z − m_z)^T P^{-1}(z − m_z) ≤ K. Since P = P^T, to distinct eigenvalues correspond orthogonal eigenvectors. Assuming that P is diagonalizable, P = T D T^T with D = diag(λ_1, λ_2, ..., λ_n) and T T^T = I. After the coordinate transformation w = T^T z, the error ellipsoid becomes (w − m_w)^T D^{-1}(w − m_w) ≤ K: in the new coordinate system, the ellipsoid axes are aligned with the axes of the new reference frame.

41 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf - Error ellipse and reference axes. Example for n=2, z = [x y]^T, P = diag(σ_x^2, σ_y^2): the locus
[x − m_x, y − m_y] P^{-1} [x − m_x, y − m_y]^T ≤ K, i.e., (x − m_x)^2/σ_x^2 + (y − m_y)^2/σ_y^2 ≤ K,
is an ellipse centered at (m_x, m_y), with semi-axes √K σ_x and √K σ_y aligned with the coordinate axes.

42 MEANING OF THE COVARIANCE MATRIX Generals on Gaussian pdf - Error ellipse and reference axes. General case for n=2, z = [x y]^T, with
P = [σ_x^2, ρσ_xσ_y; ρσ_xσ_y, σ_y^2]:
the locus [x y] P^{-1} [x y]^T ≤ K (zero mean assumed) is an ellipse whose axes are no longer aligned with the coordinate axes. The eigenvalues of P are
λ_{1,2} = (1/2)[σ_x^2 + σ_y^2 ± √((σ_x^2 − σ_y^2)^2 + 4ρ^2 σ_x^2 σ_y^2)],
the semi-axes have lengths √(K λ_1) and √(K λ_2), and the major axis makes an angle α with the x axis given by tan(2α) = 2ρσ_xσ_y/(σ_x^2 − σ_y^2), with −π/4 ≤ α ≤ π/4.

43 DISCRETE KALMAN FILTER Probabilistic interpretation of the error ellipsoid. p(x_k|Z_k) ~ N(x̂(k|k), P(k|k)). Given x̂(k|k) and P(k|k) it is possible to define the locus where, with a given probability, the values of the random vector x(k) lie: a hyperellipsoid centered at x̂(k|k) with semi-axes proportional to the square roots of the eigenvalues of P(k|k).

44 DISCRETE KALMAN FILTER Probabilistic interpretation of the error ellipsoid. p(x_k|Z_k) ~ N(x̂(k|k), P(k|k)). Example for n=2: M = {x : [x − x̂(k|k)]^T P(k|k)^{-1} [x − x̂(k|k)] ≤ K}. Pr{x(k) ∈ M} is a function of K; a pre-specified value of this probability can be obtained by an appropriate choice of K.

45 DISCRETE KALMAN FILTER Probabilistic interpretation of the error ellipsoid. For x(k) ∈ R^n with p(x_k|Z_k) ~ N(x̂(k|k), P(k|k)), the quantity [x − x̂(k|k)]^T P(k|k)^{-1} [x − x̂(k|k)] is a scalar random variable with a χ² distribution with n degrees of freedom. How to choose K for a desired probability? Just consult a Chi-square distribution table. Probability = 90%: n=1, K=2.7; n=2, K=4.6. Probability = 95%: n=1, K=3.84; n=2, K=5.99.
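
For n=2, the ellipse parameters follow directly from the eigendecomposition of P(k|k) and the tabulated χ² value; a sketch with a hypothetical covariance (the 95% value K=5.99 is taken from the table above):

```python
import numpy as np

def error_ellipse(P, K=5.99):
    """Semi-axes and orientation of {x : (x - x̂)^T P^-1 (x - x̂) <= K} for n = 2.

    K = 5.99 corresponds to a 95% probability region (chi-square, 2 d.o.f.).
    """
    eigvals, eigvecs = np.linalg.eigh(P)        # P = T D T^T, eigenvalues in ascending order
    semi_axes = np.sqrt(K * eigvals)            # semi-axis lengths sqrt(K * lambda_i)
    angle = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])   # orientation of the major axis
    return semi_axes, angle

P = np.array([[2.0, 0.6],
              [0.6, 1.0]])                      # hypothetical P(k|k)
print(error_ellipse(P))
```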

46 DISCRETE KALMAN FILTER The error ellipsoid and the filter dynamics. Prediction cycle: starting from x̂(k|k), P(k|k), with input u_k and process noise w_k of covariance Q_k, the state evolves as x_{k+1} = A_k x_k + B_k u_k + G_k w_k, and the filter propagates x̂(k+1|k) = A_k x̂(k|k) + B_k u_k and P(k+1|k) = A_k P(k|k) A_k^T + G_k Q_k G_k^T; the error ellipsoid is transformed by the dynamics and enlarged by the process noise.

47 DISCRETE KALMAN FILTER The error ellipsoid and the filter dynamics. Filtering cycle: the measurement z_{k+1} = C_{k+1} x_{k+1} + v_{k+1} produces the innovation r(k+1) = z_{k+1} − C_{k+1} x̂(k+1|k), with covariance S(k+1) = C_{k+1} P(k+1|k) C_{k+1}^T + R_{k+1}; the update x̂(k+1|k+1) = x̂(k+1|k) + K(k+1) r(k+1), P(k+1|k+1) = P(k+1|k) − K(k+1) C_{k+1} P(k+1|k) shrinks the error ellipsoid around the new estimate.

48 EXTENDED KALMAN FILTER. Nonlinear dynamics with white Gaussian system and observation noise:
x_{k+1} = f_k(x_k, u_k) + w_k, z_k = h_k(x_k) + v_k,
x_0 ~ N(x̄_0, P_0), E[w_k w_j^T] = Q_k δ_{kj}, E[v_k v_j^T] = R_k δ_{kj}.
QUESTION: which is the MMSE (minimum mean-square error) estimate of x(k+1)? The conditional mean, x̂(k+1|k+1) = E[x_{k+1}|Z_{k+1}]. Due to the nonlinearity of the system, the densities p(x_k|Z_k) and p(x_{k+1}|Z_k) are non-Gaussian.

49 EXTENDED KALMAN FILTER. (Optimal) ANSWER: the MMSE estimate is given by a nonlinear filter that propagates the conditional pdf. The EKF gives an approximation of the optimal estimate: the nonlinearities are approximated by a linearized version of the nonlinear model around the last state estimate. For this approximation to be valid, the linearization should be a good approximation of the nonlinear model in the entire uncertainty domain associated with the state estimate.

50 EXTENDED KALMAN FILTER. From p(x_k|Z_k) and x̂(k|k): linearize x_{k+1} = f_k(x_k, u_k) + w_k around x̂(k|k) and apply the KF prediction to the linearized dynamics, obtaining p(x_{k+1}|Z_k) and x̂(k+1|k); then linearize z_{k+1} = h_{k+1}(x_{k+1}) + v_{k+1} around x̂(k+1|k) and apply the KF update, obtaining p(x_{k+1}|Z_{k+1}) and x̂(k+1|k+1).

51 EXTENDED KALMAN FILTER. Linearize x_{k+1} = f_k(x_k, u_k) + w_k around x̂(k|k):
f_k(x_k, u_k) ≈ f_k(x̂(k|k), u_k) + ∇f_k (x_k − x̂(k|k)) + ...,
so that x_{k+1} ≈ ∇f_k x_k + w_k + [f_k(x̂(k|k), u_k) − ∇f_k x̂(k|k)], where the bracketed term is a known input and ∇f_k is the Jacobian of f_k evaluated at x̂(k|k). Prediction cycle of the KF applied to this linearized model:
x̂(k+1|k) = ∇f_k x̂(k|k) + [f_k(x̂(k|k), u_k) − ∇f_k x̂(k|k)] = f_k(x̂(k|k), u_k),
P(k+1|k) = ∇f_k P(k|k) ∇f_k^T + Q_k.

52 EXTENDED KALMAN FILTER. Linearize z_{k+1} = h_{k+1}(x_{k+1}) + v_{k+1} around x̂(k+1|k):
h_{k+1}(x_{k+1}) ≈ h_{k+1}(x̂(k+1|k)) + ∇h_{k+1} (x_{k+1} − x̂(k+1|k)),
so that z_{k+1} ≈ ∇h_{k+1} x_{k+1} + v_{k+1} + [h_{k+1}(x̂(k+1|k)) − ∇h_{k+1} x̂(k+1|k)], where the bracketed term is a known input and ∇h_{k+1} is the Jacobian of h_{k+1} evaluated at x̂(k+1|k). Update cycle of the KF applied to this linearized model:
x̂(k+1|k+1) = x̂(k+1|k) + P(k+1|k) ∇h_{k+1}^T [∇h_{k+1} P(k+1|k) ∇h_{k+1}^T + R_{k+1}]^{-1} [z_{k+1} − h_{k+1}(x̂(k+1|k))],
P(k+1|k+1) = P(k+1|k) − P(k+1|k) ∇h_{k+1}^T [∇h_{k+1} P(k+1|k) ∇h_{k+1}^T + R_{k+1}]^{-1} ∇h_{k+1} P(k+1|k).
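
A minimal EKF sketch following the two linearization steps above; f, h and their Jacobians are user-supplied callables, and all names here are illustrative rather than taken from the slides:

```python
import numpy as np

def ekf_step(x_filt, P_filt, u, z, f, F_jac, h, H_jac, Q, R):
    """One EKF cycle: predict with f linearized at x̂(k|k), update with h linearized at x̂(k+1|k)."""
    # Prediction cycle (linearize f around x̂(k|k))
    F = F_jac(x_filt, u)                  # ∇f evaluated at x̂(k|k)
    x_pred = f(x_filt, u)                 # x̂(k+1|k) = f(x̂(k|k), u_k)
    P_pred = F @ P_filt @ F.T + Q
    # Update cycle (linearize h around x̂(k+1|k))
    H = H_jac(x_pred)                     # ∇h evaluated at x̂(k+1|k)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))  # innovation uses the nonlinear h
    P_new = P_pred - K @ H @ P_pred
    return x_new, P_new
```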

53 References. B. Anderson, J. Moore, Optimal Filtering, Prentice-Hall, 1979. M. Athans, Dynamic Stochastic Estimation, Prediction and Smoothing, Series of Lectures, Spring 1999. E. W. Kamen, J. K. Su, Introduction to Optimal Estimation, Springer, 1999. Peter S. Maybeck, The Kalman Filter: an Introduction to Concepts. Jerry M. Mendel, Lessons in Digital Estimation Theory, Prentice-Hall, 1987. M. Isabel Ribeiro, Notas Dispersas sobre Filtragem de Kalman, CAPS, IST, June 1989 (http://www.isr.ist.utl.pt/~mir).
