EE 570: Location and Navigation
On-Line Bayesian Tracking

Aly El-Osery (Electrical Engineering Department, New Mexico Tech, Socorro, New Mexico, USA)
Stephen Bruder (Electrical and Computer Engineering Department, Embry-Riddle Aeronautical University, Prescott, Arizona, USA)

March 27, 2014
Objective

Sequentially estimate on-line the states of a system as it changes over time, using observations that are corrupted with noise.

- Filtering: the time of the estimate coincides with the last measurement.
- Smoothing: the time of the estimate is within the span of the measurements.
- Prediction: the time of the estimate occurs after the last available measurement.
Given State-Space Equations

x_k = f_k(x_{k-1}, w_{k-1})    (1)
z_k = h_k(x_k, v_k)    (2)

where:
- x_k is the (n × 1) state vector at time k
- z_k is the (m × 1) measurement vector at time k
- f_k : R^n × R^{n_w} → R^n is a possibly non-linear state function
- h_k : R^n × R^{n_v} → R^m is a possibly non-linear measurement function
- w_{k-1} is i.i.d. state noise
- v_k is i.i.d. measurement noise

The state process is a Markov chain, i.e., p(x_k | x_1, ..., x_{k-1}) = p(x_k | x_{k-1}), and the distribution of z_k conditioned on the state x_k is independent of previous state and measurement values, i.e., p(z_k | x_{1:k}, z_{1:k-1}) = p(z_k | x_k).
Objective

Probabilistically estimate x_k using the previous measurements z_{1:k}; in other words, construct the pdf p(x_k | z_{1:k}).

Optimal MMSE Estimate

E{ ||x_k - x̂_k||² | z_{1:k} } = ∫ ||x_k - x̂_k||² p(x_k | z_{1:k}) dx_k    (3)

In other words, find the conditional mean

x̂_k = E{ x_k | z_{1:k} } = ∫ x_k p(x_k | z_{1:k}) dx_k    (4)
Assumptions

w_k and v_k are drawn from Gaussian distributions, have zero mean, are uncorrelated, and are statistically independent of each other:

E{ w_k w_i^T } = Q_k if i = k, 0 if i ≠ k    (5)
E{ v_k v_i^T } = R_k if i = k, 0 if i ≠ k    (6)
E{ w_k v_i^T } = 0 for all i, k    (7)
f_k and h_k are both linear, e.g., the state-space system equations may be written as

x_k = Φ_{k-1} x_{k-1} + w_{k-1}    (8)
z_k = H_k x_k + v_k    (9)

where Φ_{k-1} is the (n × n) transition matrix relating x_{k-1} to x_k, and H_k is the (m × n) matrix providing the noiseless connection between the measurement and state vectors.
State-Space Equations

x̂_{k|k-1} = Φ_{k-1} x̂_{k-1|k-1}    (10)
P_{k|k-1} = Q_{k-1} + Φ_{k-1} P_{k-1|k-1} Φ_{k-1}^T    (11)
x̂_{k|k} = x̂_{k|k-1} + K_k (z_k - H_k x̂_{k|k-1})    (12)
P_{k|k} = (I - K_k H_k) P_{k|k-1}    (13)

where K_k is the (n × m) Kalman gain and (z_k - H_k x̂_{k|k-1}) is the measurement innovation.
Kalman Gain

K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1}    (14)

where (H_k P_{k|k-1} H_k^T + R_k) is the covariance of the innovation term.
Kalman filter data flow

1. Start with an initial estimate (x̂_0 and P_0).
2. Compute the Kalman gain: K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1}
3. Update the estimate with measurement z_k: x̂_{k|k} = x̂_{k|k-1} + K_k (z_k - H_k x̂_{k|k-1})
4. Update the error covariance: P_{k|k} = P_{k|k-1} - K_k H_k P_{k|k-1}
5. Project ahead, then set k = k + 1 and return to step 2:
   x̂_{k|k-1} = Φ_{k-1} x̂_{k-1|k-1}
   P_{k|k-1} = Q_{k-1} + Φ_{k-1} P_{k-1|k-1} Φ_{k-1}^T
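The loop above can be sketched in a few lines of NumPy. The constant-velocity model, step size, and noise covariances below are illustrative assumptions for the sketch, not values from the lecture:

```python
import numpy as np

def kalman_step(x_est, P, z, Phi, H, Q, R):
    """One predict/update cycle of the discrete Kalman filter (Eqs. 10-14)."""
    # Project ahead (Eqs. 10-11)
    x_pred = Phi @ x_est
    P_pred = Q + Phi @ P @ Phi.T

    # Compute the Kalman gain (Eq. 14)
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)

    # Update with the measurement (Eqs. 12-13)
    x_new = x_pred + K @ (z - H @ x_pred)    # innovation-weighted correction
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: constant-velocity model with a noisy position measurement.
tau = 0.1
Phi = np.array([[1.0, tau], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])

x_est, P = np.zeros(2), np.eye(2)
for z in [0.1, 0.22, 0.29, 0.41]:
    x_est, P = kalman_step(x_est, P, np.array([z]), Phi, H, Q, R)
```

Note that the gain, update, and projection are one function: in a real navigation loop this function would be called once per measurement epoch.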
Sequential Processing

If R is a block-diagonal matrix, i.e., R = diag(R^1, R^2, ..., R^r), where each R^i has dimensions p_i × p_i, then we can sequentially process the measurements as follows. For i = 1, 2, ..., r:

K^i = P^{i-1} (H^i)^T (H^i P^{i-1} (H^i)^T + R^i)^{-1}    (15)
x̂^i_{k|k} = x̂^{i-1}_{k|k} + K^i (z^i_k - H^i x̂^{i-1}_{k|k})    (16)
P^i = (I - K^i H^i) P^{i-1}    (17)

where x̂^0_{k|k} = x̂_{k|k-1}, P^0 = P_{k|k-1}, and H^i is the (p_i × n) block of H whose rows correspond to the measurements being processed.
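A quick NumPy sketch can confirm that, for a block-diagonal R, processing the measurements one at a time via Eqs. (15)-(17) reproduces the batch update. The matrices below are random placeholders chosen only for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
H = rng.standard_normal((m, n))
P_pred = np.eye(n)
x_pred = rng.standard_normal(n)
z = rng.standard_normal(m)
R = np.diag([0.5, 0.8])                  # block-diagonal (scalar blocks here)

# Batch update: all measurements at once.
K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
x_batch = x_pred + K @ (z - H @ x_pred)
P_batch = (np.eye(n) - K @ H) @ P_pred

# Sequential update (Eqs. 15-17): one scalar measurement at a time.
x_seq, P_seq = x_pred.copy(), P_pred.copy()
for i in range(m):
    Hi = H[i:i + 1, :]                   # p_i x n row block of H
    Ri = R[i:i + 1, i:i + 1]
    Ki = P_seq @ Hi.T @ np.linalg.inv(Hi @ P_seq @ Hi.T + Ri)
    x_seq = x_seq + Ki @ (z[i:i + 1] - Hi @ x_seq)
    P_seq = (np.eye(n) - Ki @ Hi) @ P_seq
```

Sequential processing replaces one (m × m) matrix inverse with r small (p_i × p_i) inverses, which is attractive in embedded implementations.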
Observability

The system is observable if the observability matrix

O(k) = [ H(k-n+1)
         H(k-n+2) Φ(k-n+1)
         ⋮
         H(k) Φ(k-1) ··· Φ(k-n+1) ]    (18)

where n is the number of states, has a rank of n. The rank of O is a binary indicator: it does not provide a measure of how close the system is to being unobservable, and hence is prone to numerical ill-conditioning.
A Better Observability Measure

In addition to computing the rank of O(k), compute the Singular Value Decomposition (SVD) of O(k),

O = U Σ V^T    (19)

and observe the diagonal values of the matrix Σ. Using this approach it is possible to monitor the variations in the system observability due to changes in the system dynamics.
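As a minimal sketch, the check below builds O for the time-invariant special case of Eq. (18) (constant Φ and H, so the stack reduces to H, HΦ, HΦ², ...) and inspects both its rank and its singular values; the two-state model is an assumed example:

```python
import numpy as np

def observability_matrix(Phi, H, n):
    """Stack H, H Phi, ..., H Phi^(n-1) for a time-invariant system (Eq. 18)."""
    rows, A = [], np.eye(Phi.shape[0])
    for _ in range(n):
        rows.append(H @ A)
        A = Phi @ A
    return np.vstack(rows)

# Example: 2-state constant-velocity model with position-only measurements.
Phi = np.array([[1.0, 0.1], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
O = observability_matrix(Phi, H, n=2)

rank = np.linalg.matrix_rank(O)
sigma = np.linalg.svd(O, compute_uv=False)   # singular values, descending
```

A small smallest singular value (relative to the largest) warns of near-unobservability even when the rank test still passes.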
Remarks

- The Kalman filter is optimal under the aforementioned assumptions; it is also an unbiased, minimum-variance estimator.
- If the Gaussian assumption does not hold, the Kalman filter is biased and not minimum variance.
- Observability is dynamics dependent.
- The error covariance update may be implemented using the Joseph form, which provides a more stable solution due to its guaranteed symmetry:

P_{k|k} = (I - K_k H_k) P_{k|k-1} (I - K_k H_k)^T + K_k R_k K_k^T    (20)
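A short sketch of the Joseph form: for the optimal gain it agrees with the standard update of Eq. (13), but it remains symmetric (and positive semi-definite) even when K is perturbed. The numbers below are arbitrary example values:

```python
import numpy as np

def joseph_update(P_pred, K, H, R):
    """Joseph-form covariance update (Eq. 20): symmetric by construction."""
    I_KH = np.eye(P_pred.shape[0]) - K @ H
    return I_KH @ P_pred @ I_KH.T + K @ R @ K.T

# Compare with the standard form (Eq. 13) on a small example.
P_pred = np.array([[2.0, 0.3], [0.3, 1.0]])
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # optimal gain

P_standard = (np.eye(2) - K @ H) @ P_pred
P_joseph = joseph_update(P_pred, K, H, R)
```

The extra matrix products make the Joseph form slightly more expensive, which is the usual trade against its numerical robustness.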
System Model

To obtain the state vector estimate x̂(t), start from the continuous-time model

ẋ(t) = F(t) x(t) + G(t) w(t)    (21)

Taking the expectation, x̂(t) = E{x(t)} satisfies

dx̂(t)/dt = F(t) x̂(t)    (22)

Solving the above equation over the interval [t - τ_s, t]:

x̂(t) = exp( ∫_{t-τ_s}^{t} F(t') dt' ) x̂(t - τ_s)    (23)
System Model Discretization

As shown in the Kalman filter equations, the state vector estimate is given by x̂_{k|k-1} = Φ_{k-1} x̂_{k-1|k-1}. Therefore,

Φ_{k-1} = e^{F_{k-1} τ_s} ≈ I + F_{k-1} τ_s    (24)

where F_{k-1} is the average of F at times t and t - τ_s, and a first-order approximation is used.
Discrete Covariance Matrix Q_k

Assuming white noise, a small time step, G constant over the integration period, and trapezoidal integration:

Q_k ≈ (1/2) [ Φ_{k-1} G_{k-1} Q(t_{k-1}) G_{k-1}^T Φ_{k-1}^T + G_{k-1} Q(t_{k-1}) G_{k-1}^T ] τ_s    (25)

where

E{ w(η) w^T(ζ) } = Q(η) δ(η - ζ)    (26)
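The two discretization formulas, Eqs. (24) and (25), can be checked on a single first-order Markov state, where exact expressions are known. The parameter values are assumptions for the sketch:

```python
import numpy as np

# Continuous model: one first-order Markov state, b_dot = -(1/Tc) b + w.
Tc, tau_s = 1.0, 0.01
F = np.array([[-1.0 / Tc]])
G = np.eye(1)
Qc = np.array([[2.0 * 1.0**2 / Tc]])     # Q(t) for sigma_BI = 1 (Eq. 32)

# First-order transition matrix (Eq. 24)
Phi = np.eye(1) + F * tau_s

# Trapezoidal discrete process noise (Eq. 25)
GQGT = G @ Qc @ G.T
Qk = 0.5 * (Phi @ GQGT @ Phi.T + GQGT) * tau_s
```

For this scalar case the exact values are Φ = e^{-τ_s/T_c} and Q = σ²_{BI}(1 - e^{-2τ_s/T_c}) (Eqs. 33-34), so the approximations can be compared directly.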
Linearized System

F_k = ∂f(x)/∂x |_{x = x̂_{k|k-1}},    H_k = ∂h(x)/∂x |_{x = x̂_{k|k-1}}    (27)

where ∂f(x)/∂x and ∂h(x)/∂x are the Jacobian matrices

∂f(x)/∂x = [ ∂f_1/∂x_1  ∂f_1/∂x_2  ···
             ∂f_2/∂x_1  ∂f_2/∂x_2  ···
             ⋮          ⋮          ⋱ ]    (28)

and similarly for ∂h(x)/∂x.
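When analytic Jacobians are tedious, Eq. (27) can be approximated numerically. The sketch below uses central differences on an assumed range/bearing measurement function (not one from the lecture):

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Central-difference approximation of df/dx evaluated at x (Eqs. 27-28)."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x))
    J = np.zeros((len(fx), len(x)))
    for j in range(len(x)):
        dx = np.zeros(len(x))
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - np.asarray(f(x - dx))) / (2 * eps)
    return J

# Example non-linear measurement: range and bearing from a 2-D position.
def h(x):
    px, py = x
    return np.array([np.hypot(px, py), np.arctan2(py, px)])

Hk = jacobian(h, np.array([3.0, 4.0]))
```

In an EKF this H_k would be re-evaluated at each new predicted state x̂_{k|k-1}.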
First-Order Markov Noise

State Equation:

ḃ(t) = -(1/T_c) b(t) + w(t)    (29)

Autocorrelation Function:

E{ b(t) b(t+τ) } = σ²_{BI} e^{-|τ|/T_c}    (30)

where

E{ w(t) w(t+τ) } = Q(t) δ(t-τ)    (31)

T_c is the correlation time, and

Q(t) = 2 σ²_{BI} / T_c    (32)
Discrete First-Order Markov Noise

State Equation:

b_k = e^{-τ_s/T_c} b_{k-1} + w_{k-1}    (33)

System Covariance Matrix:

Q = σ²_{BI} [ 1 - e^{-2τ_s/T_c} ]    (34)
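Equations (33)-(34) can be exercised by simulating the discrete process and checking that its steady-state variance recovers σ²_{BI}. The parameter values are assumptions for the sketch:

```python
import numpy as np

# Simulate the discrete first-order Markov noise of Eqs. (33)-(34).
rng = np.random.default_rng(1)
sigma_BI, Tc, tau_s = 2.0, 0.5, 0.01
phi = np.exp(-tau_s / Tc)                              # Eq. (33) coefficient
q = sigma_BI**2 * (1.0 - np.exp(-2.0 * tau_s / Tc))    # variance of w_{k-1}, Eq. (34)

N = 200_000
b = np.zeros(N)
w = rng.normal(0.0, np.sqrt(q), size=N)
for k in range(1, N):
    b[k] = phi * b[k - 1] + w[k - 1]

sample_var = b.var()   # should approach sigma_BI**2 = 4.0
```

The choice of q in Eq. (34) is exactly what makes the steady-state variance of this AR(1) process equal σ²_{BI}, matching the continuous autocorrelation of Eq. (30) at τ = 0.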
Autocorrelation of 1st-Order Markov Noise

R_b(τ) = σ²_{BI} e^{-|τ|/T_c}

[Figure: plot of R_b(τ) versus τ, peaking at σ²_{BI} at τ = 0 and decaying exponentially with correlation time T_c.]
Small Correlation Time (T_c = 0.01)

[Figures: autocorrelation R_b(τ) versus τ, and time history b(t) versus time (sec) showing the measured, actual, and estimated signals.]
Larger Correlation Time (T_c = 0.1)

[Figures: autocorrelation R_b(τ) versus τ, and time history b(t) versus time (sec) showing the measured, actual, and estimated signals.]
Unscented Kalman Filter (UKF)

Propagates carefully chosen sample points (using the unscented transformation) through the true non-linear system, and therefore captures the posterior mean and covariance accurately to the second order.
Particle Filter

A Monte Carlo based method that allows for a complete representation of the state distribution function. Unlike the EKF and UKF, particle filters do not require the Gaussian assumption.
Reference: Bayesian Filtering: From Kalman Filters to Particle Filters, and Beyond, by Zhe Chen.