EE 570: Location and Navigation


1 EE 570: Location and Navigation: On-Line Bayesian Tracking
Aly El-Osery (Electrical Engineering Department, New Mexico Tech, Socorro, New Mexico, USA) and Stephen Bruder (Electrical and Computer Engineering Department, Embry-Riddle Aeronautical University, Prescott, Arizona, USA)
March 27, 2014

2 Objective
Sequentially estimate on-line the states of a system as it changes over time, using observations corrupted by noise.
Filtering: the time of the estimate coincides with the last measurement.
Smoothing: the time of the estimate is within the span of the measurements.
Prediction: the time of the estimate occurs after the last available measurement.

3-7 Given State-Space Equations

x_k = f_k(x_{k-1}, w_{k-1})   (1)
z_k = h_k(x_k, v_k)   (2)

where
x_k is the (n x 1) state vector at time k,
z_k is the (m x 1) measurement vector at time k,
f_k : R^n x R^{n_w} -> R^n is a possibly non-linear function,
h_k : R^n x R^{n_v} -> R^m is a possibly non-linear function,
w_{k-1} is i.i.d. state (process) noise, and
v_k is i.i.d. measurement noise.

The state process is a Markov chain, i.e., p(x_k | x_1, ..., x_{k-1}) = p(x_k | x_{k-1}), and the distribution of z_k conditioned on the state x_k is independent of previous state and measurement values, i.e., p(z_k | x_{1:k}, z_{1:k-1}) = p(z_k | x_k).

8-9 Objective

Probabilistically estimate x_k using the measurements z_{1:k}; in other words, construct the pdf p(x_k | z_{1:k}).

Optimal MMSE Estimate: minimize

E{ ||x_k - \hat{x}_k||^2 | z_{1:k} } = \int ||x_k - \hat{x}_k||^2 p(x_k | z_{1:k}) dx_k   (3)

in other words, find the conditional mean

\hat{x}_k = E{ x_k | z_{1:k} } = \int x_k p(x_k | z_{1:k}) dx_k   (4)

10 Assumptions
w_k and v_k are drawn from Gaussian distributions, are zero mean, uncorrelated in time, and statistically independent of each other:

E{ w_k w_i^T } = Q_k for i = k, and 0 for i != k   (5)
E{ v_k v_i^T } = R_k for i = k, and 0 for i != k   (6)
E{ w_k v_i^T } = 0 for all i, k   (7)

11-13 Assumptions
f_k and h_k are both linear, e.g., the state-space system equations may be written as

x_k = \Phi_{k-1} x_{k-1} + w_{k-1}   (8)
z_k = H_k x_k + v_k   (9)

where \Phi_{k-1} is the (n x n) transition matrix relating x_{k-1} to x_k, and H_k is the (m x n) matrix that provides the noiseless connection between the measurement and state vectors.

14-16 State-Space Equations

\hat{x}_{k|k-1} = \Phi_{k-1} \hat{x}_{k-1|k-1}   (10)
P_{k|k-1} = Q_{k-1} + \Phi_{k-1} P_{k-1|k-1} \Phi_{k-1}^T   (11)
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k ( z_k - H_k \hat{x}_{k|k-1} )   (12)
P_{k|k} = (I - K_k H_k) P_{k|k-1}   (13)

where K_k is the (n x m) Kalman gain and ( z_k - H_k \hat{x}_{k|k-1} ) is the measurement innovation.

17-18 Kalman Gain

K_k = P_{k|k-1} H_k^T ( H_k P_{k|k-1} H_k^T + R_k )^{-1}   (14)

where ( H_k P_{k|k-1} H_k^T + R_k ) is the covariance of the innovation term.

19-24 Kalman filter data flow

Start with an initial estimate (\hat{x}_0 and P_0), then loop:

1. Compute the Kalman gain: K_k = P_{k|k-1} H_k^T ( H_k P_{k|k-1} H_k^T + R_k )^{-1}
2. Update the estimate with measurement z_k: \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k ( z_k - H_k \hat{x}_{k|k-1} )
3. Update the error covariance: P_{k|k} = P_{k|k-1} - K_k H_k P_{k|k-1}
4. Project ahead: \hat{x}_{k|k-1} = \Phi_{k-1} \hat{x}_{k-1|k-1}, P_{k|k-1} = Q_{k-1} + \Phi_{k-1} P_{k-1|k-1} \Phi_{k-1}^T, and set k = k + 1.
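The gain/update/project loop above can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the original slides; the helper name `kalman_step` and the example matrices are chosen for the sketch.

```python
import numpy as np

def kalman_step(x_est, P, z, Phi, H, Q, R):
    """One cycle of the discrete Kalman filter.

    x_est, P : posterior state estimate and covariance at k-1
    z        : measurement at k
    Phi, H   : transition and measurement matrices
    Q, R     : process and measurement noise covariances
    """
    # Project ahead (time update)
    x_pred = Phi @ x_est
    P_pred = Phi @ P @ Phi.T + Q
    # Kalman gain, using the innovation covariance S
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Update estimate with the measurement innovation
    x_new = x_pred + K @ (z - H @ x_pred)
    # Update error covariance
    P_new = (np.eye(len(x_new)) - K @ H) @ P_pred
    return x_new, P_new
```

For a scalar constant state with Phi = H = 1, Q = 0, R = 1, starting at x = 0, P = 1, a measurement z = 1 yields gain K = 0.5 and the posterior x = 0.5, P = 0.5.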

25 Sequential Processing
If R is block diagonal, i.e., R = diag(R^1, R^2, ..., R^r), where block R^i has dimensions p_i x p_i, then we can process the measurements sequentially. For i = 1, 2, ..., r:

K^i = P^{i-1} (H^i)^T ( H^i P^{i-1} (H^i)^T + R^i )^{-1}   (15)
\hat{x}^i_{k|k} = \hat{x}^{i-1}_{k|k} + K^i ( z^i_k - H^i \hat{x}^{i-1}_{k|k} )   (16)
P^i = (I - K^i H^i) P^{i-1}   (17)

where \hat{x}^0_{k|k} = \hat{x}_{k|k-1}, P^0 = P_{k|k-1}, and H^i is the p_i x n block of rows of H corresponding to the measurement block being processed.
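A minimal sketch of this sequential update, assuming R is supplied as a list of diagonal blocks; the function name `sequential_update` is illustrative. For a block-diagonal R the result matches the batch update of Eqs. (12)-(14).

```python
import numpy as np

def sequential_update(x_pred, P_pred, z, H, R_blocks):
    """Process the measurement vector block by block (Eqs. 15-17),
    assuming R = diag(R_blocks[0], R_blocks[1], ...)."""
    x, P = x_pred.copy(), P_pred.copy()
    row = 0
    for R_i in R_blocks:
        p = R_i.shape[0]
        H_i = H[row:row + p, :]      # rows of H for this block
        z_i = z[row:row + p]
        S = H_i @ P @ H_i.T + R_i    # innovation covariance for the block
        K = P @ H_i.T @ np.linalg.inv(S)
        x = x + K @ (z_i - H_i @ x)
        P = (np.eye(P.shape[0]) - K @ H_i) @ P
        row += p
    return x, P
```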

26 Observability
The system is observable if the observability matrix

O(k) = \begin{bmatrix} H(k-n+1) \\ H(k-n+2)\Phi(k-n+1) \\ \vdots \\ H(k)\Phi(k-1)\cdots\Phi(k-n+1) \end{bmatrix}   (18)

where n is the number of states, has rank n. The rank of O is a binary indicator; it does not provide a measure of how close the system is to being unobservable, and hence is prone to numerical ill-conditioning.

27 A Better Observability Measure
In addition to computing the rank of O(k), compute the Singular Value Decomposition (SVD) of O(k) as

O = U \Sigma V^T   (19)

and observe the diagonal values of the matrix \Sigma. Using this approach it is possible to monitor the variations in system observability due to changes in the system dynamics.
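As a sketch (names and calling convention are assumptions of this example, not from the slides), the observability matrix of Eq. (18) can be stacked from sequences of H and \Phi matrices and its rank and singular values inspected with NumPy:

```python
import numpy as np

def observability_svd(Phi_seq, H_seq):
    """Stack the observability matrix from matched sequences of
    measurement matrices H_seq[0..n-1] and transition matrices
    Phi_seq[i] (propagating step i -> i+1), then return its rank
    and singular values."""
    n = Phi_seq[0].shape[0]
    blocks, T = [], np.eye(n)
    for H, Phi in zip(H_seq, [np.eye(n)] + list(Phi_seq)):
        T = Phi @ T                  # accumulated transition so far
        blocks.append(H @ T)
    O = np.vstack(blocks)
    s = np.linalg.svd(O, compute_uv=False)
    return np.linalg.matrix_rank(O), s
```

For a constant-velocity model with position measurements the rank is full; measuring only velocity drops the rank, and the singular values show how close to singular O is.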

28 Remarks
The Kalman filter is optimal under the aforementioned assumptions; it is an unbiased, minimum-variance estimator.
If the Gaussian assumption does not hold, the Kalman filter is no longer the minimum-variance estimator (it remains the best linear unbiased estimator).
Observability is dynamics dependent.
The error covariance update may be implemented using the Joseph form, which provides a more numerically stable solution due to its guaranteed symmetry:

P_{k|k} = (I - K_k H_k) P_{k|k-1} (I - K_k H_k)^T + K_k R_k K_k^T   (20)
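The Joseph form of Eq. (20) is a one-liner in NumPy; this sketch (function name assumed for illustration) gives the same result as the standard update when K is the optimal gain, while staying symmetric for any gain:

```python
import numpy as np

def joseph_update(P_pred, K, H, R):
    """Joseph-form covariance update, Eq. (20): symmetric and
    positive semidefinite for any (even suboptimal) gain K."""
    I_KH = np.eye(P_pred.shape[0]) - K @ H
    return I_KH @ P_pred @ I_KH.T + K @ R @ K.T
```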

29 System Model
To obtain the state vector estimate \hat{x}(t), start from the continuous-time model

\dot{x}(t) = F(t) x(t) + G(t) w(t)   (21)

Taking the expectation,

\frac{d}{dt} \hat{x}(t) = F(t) \hat{x}(t)   (22)

Solving the above equation over the interval [t - \tau_s, t],

\hat{x}(t) = e^{\int_{t-\tau_s}^{t} F(t') dt'} \hat{x}(t - \tau_s)   (23)

30-31 System Model Discretization
As shown in the Kalman filter equations, the state vector estimate is given by \hat{x}_{k|k-1} = \Phi_{k-1} \hat{x}_{k-1|k-1}. Therefore,

\Phi_{k-1} = e^{F_{k-1} \tau_s} \approx I + F_{k-1} \tau_s   (24)

where F_{k-1} is the average of F at times t and t - \tau_s, and a first-order approximation is used.

32 Discrete Covariance Matrix Q_k
Assuming white noise, a small time step, G constant over the integration period, and trapezoidal integration,

Q_k \approx \frac{1}{2} [ \Phi_{k-1} G_{k-1} Q(t_{k-1}) G_{k-1}^T \Phi_{k-1}^T + G_{k-1} Q(t_{k-1}) G_{k-1}^T ] \tau_s   (25)

where

E{ w(\eta) w^T(\zeta) } = Q(\eta) \delta(\eta - \zeta)   (26)
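Eqs. (24) and (25) together give a simple discretization routine. A minimal sketch, assuming the first-order transition approximation and the trapezoidal Q_k with the 1/2 factor; the function name `discretize` is illustrative:

```python
import numpy as np

def discretize(F, G, Q_cont, tau_s):
    """First-order discretization: Phi ~= I + F*tau_s (Eq. 24) and
    the trapezoidal approximation of Q_k (Eq. 25)."""
    n = F.shape[0]
    Phi = np.eye(n) + F * tau_s
    GQGt = G @ Q_cont @ G.T
    Qk = 0.5 * (Phi @ GQGt @ Phi.T + GQGt) * tau_s
    return Phi, Qk
```

For a constant-velocity model driven by acceleration noise (F = [[0,1],[0,0]], G = [0,1]^T), this produces the familiar position-velocity process noise coupling.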

33 Linearized System

F_k = \frac{\partial f(x)}{\partial x} \Big|_{x = \hat{x}_{k|k-1}},  H_k = \frac{\partial h(x)}{\partial x} \Big|_{x = \hat{x}_{k|k-1}}   (27)

where

\frac{\partial f(x)}{\partial x} = \begin{bmatrix} \partial f_1/\partial x_1 & \partial f_1/\partial x_2 & \cdots \\ \partial f_2/\partial x_1 & \partial f_2/\partial x_2 & \cdots \\ \vdots & \vdots & \ddots \end{bmatrix},  \frac{\partial h(x)}{\partial x} = \begin{bmatrix} \partial h_1/\partial x_1 & \partial h_1/\partial x_2 & \cdots \\ \partial h_2/\partial x_1 & \partial h_2/\partial x_2 & \cdots \\ \vdots & \vdots & \ddots \end{bmatrix}   (28)
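When the analytic partials of Eq. (28) are tedious, the Jacobians can be checked (or approximated) numerically. A central-difference sketch, with the name `jacobian` chosen for illustration:

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Central-difference approximation of the matrix of partials
    d f_i / d x_j evaluated at x."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x))
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - np.asarray(f(x - dx))) / (2 * eps)
    return J
```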

34 First Order Markov Noise
State Equation:

\dot{b}(t) = -\frac{1}{T_c} b(t) + w(t)   (29)

Autocorrelation Function:

E{ b(t) b(t+\tau) } = \sigma_{BI}^2 e^{-|\tau|/T_c}   (30)

where

E{ w(t) w(t+\tau) } = Q(t) \delta(\tau)   (31)
Q(t) = \frac{2 \sigma_{BI}^2}{T_c}   (32)

and T_c is the correlation time.

35 Discrete First Order Markov Noise
State Equation:

b_k = e^{-\tau_s / T_c} b_{k-1} + w_{k-1}   (33)

System Covariance Matrix:

Q = \sigma_{BI}^2 [ 1 - e^{-2 \tau_s / T_c} ]   (34)
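Eqs. (33)-(34) can be simulated directly; the steady-state variance of the generated sequence approaches \sigma_{BI}^2. A minimal sketch, assuming Gaussian driving noise and a function name chosen for the example:

```python
import numpy as np

def simulate_markov(sigma_bi, T_c, tau_s, n_steps, seed=0):
    """Simulate the discrete first-order Gauss-Markov process
    b_k = exp(-tau_s/T_c) * b_{k-1} + w_{k-1}, with
    Var(w) = sigma_bi**2 * (1 - exp(-2*tau_s/T_c))  (Eqs. 33-34)."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-tau_s / T_c)
    q = sigma_bi ** 2 * (1.0 - np.exp(-2.0 * tau_s / T_c))
    b = np.zeros(n_steps)
    for k in range(1, n_steps):
        b[k] = phi * b[k - 1] + rng.normal(0.0, np.sqrt(q))
    return b
```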

36 Autocorrelation of 1st Order Markov
[Figure: plot of R_b(\tau) = \sigma_{BI}^2 e^{-|\tau|/T_c} versus \tau]

37 Small Correlation Time (T_c = 0.01)
[Figure: R_b(\tau) versus \tau]

38 Small Correlation Time
[Figure: R_b(\tau) versus \tau, and b(t) versus time (sec) showing the measured, actual, and estimated signals]

39 Larger Correlation Time (T_c = 0.1)
[Figure: R_b(\tau) versus \tau]

40 Larger Correlation Time
[Figure: R_b(\tau) versus \tau, and b(t) versus time (sec) showing the measured, actual, and estimated signals]

41 Unscented Kalman Filter (UKF)
The UKF propagates carefully chosen sample points (obtained via the unscented transformation) through the true non-linear system, and therefore captures the posterior mean and covariance accurately to second order.
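The heart of the UKF is the choice of sample points. A minimal sketch of the basic (non-scaled) unscented transformation, with the helper name and the parameter kappa as assumptions of this example:

```python
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """Generate 2n+1 sigma points and weights for mean x and
    covariance P (basic unscented transformation)."""
    n = x.size
    # Columns of S satisfy sum_i S[:,i] S[:,i]^T = (n + kappa) P
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w
```

By construction the weighted mean and covariance of the points reproduce x and P exactly; propagating the points through a non-linearity and re-averaging yields the second-order-accurate posterior statistics.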

42 Particle Filter
A Monte Carlo based method that allows for a complete representation of the state distribution function. Unlike the EKF and UKF, particle filters do not require the Gaussian assumptions.
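A single step of the simplest variant, the bootstrap particle filter, can be sketched as follows. This is an illustrative example only; Gaussian process and measurement noise are assumed here just to keep the sketch short (the method itself does not require them), and all names are chosen for the example:

```python
import numpy as np

def particle_step(particles, z, f, h, q_std, r_std, rng):
    """One bootstrap particle filter step: propagate each particle
    through the (possibly non-linear) dynamics f, weight it by the
    likelihood of measurement z around h(particle), then resample."""
    n = particles.size
    # Propagate with process noise
    particles = f(particles) + rng.normal(0.0, q_std, n)
    # Weight by the measurement likelihood
    w = np.exp(-0.5 * ((z - h(particles)) / r_std) ** 2)
    w /= w.sum()
    # Multinomial resampling to avoid weight degeneracy
    idx = rng.choice(n, size=n, p=w)
    return particles[idx]
```

With identity dynamics and repeated measurements of a constant, the particle cloud concentrates around the measured value.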

43 Further Reading
Z. Chen, "Bayesian Filtering: From Kalman Filters to Particle Filters, and Beyond."


More information

Bayesian Machine Learning (ML): Modeling And Inference in Big Data. Zhuhua Cai Google, Rice University caizhua@gmail.com

Bayesian Machine Learning (ML): Modeling And Inference in Big Data. Zhuhua Cai Google, Rice University caizhua@gmail.com Bayesian Machine Learning (ML): Modeling And Inference in Big Data Zhuhua Cai Google Rice University caizhua@gmail.com 1 Syllabus Bayesian ML Concepts (Today) Bayesian ML on MapReduce (Next morning) Bayesian

More information

Univariate and Multivariate Methods PEARSON. Addison Wesley

Univariate and Multivariate Methods PEARSON. Addison Wesley Time Series Analysis Univariate and Multivariate Methods SECOND EDITION William W. S. Wei Department of Statistics The Fox School of Business and Management Temple University PEARSON Addison Wesley Boston

More information

10-601. Machine Learning. http://www.cs.cmu.edu/afs/cs/academic/class/10601-f10/index.html

10-601. Machine Learning. http://www.cs.cmu.edu/afs/cs/academic/class/10601-f10/index.html 10-601 Machine Learning http://www.cs.cmu.edu/afs/cs/academic/class/10601-f10/index.html Course data All up-to-date info is on the course web page: http://www.cs.cmu.edu/afs/cs/academic/class/10601-f10/index.html

More information

Communication on the Grassmann Manifold: A Geometric Approach to the Noncoherent Multiple-Antenna Channel

Communication on the Grassmann Manifold: A Geometric Approach to the Noncoherent Multiple-Antenna Channel IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 2, FEBRUARY 2002 359 Communication on the Grassmann Manifold: A Geometric Approach to the Noncoherent Multiple-Antenna Channel Lizhong Zheng, Student

More information

Bayesian Statistics: Indian Buffet Process

Bayesian Statistics: Indian Buffet Process Bayesian Statistics: Indian Buffet Process Ilker Yildirim Department of Brain and Cognitive Sciences University of Rochester Rochester, NY 14627 August 2012 Reference: Most of the material in this note

More information

Unsupervised and supervised dimension reduction: Algorithms and connections

Unsupervised and supervised dimension reduction: Algorithms and connections Unsupervised and supervised dimension reduction: Algorithms and connections Jieping Ye Department of Computer Science and Engineering Evolutionary Functional Genomics Center The Biodesign Institute Arizona

More information

Variations of Statistical Models

Variations of Statistical Models 38. Statistics 1 38. STATISTICS Revised September 2013 by G. Cowan (RHUL). This chapter gives an overview of statistical methods used in high-energy physics. In statistics, we are interested in using a

More information

QUALITY ENGINEERING PROGRAM

QUALITY ENGINEERING PROGRAM QUALITY ENGINEERING PROGRAM Production engineering deals with the practical engineering problems that occur in manufacturing planning, manufacturing processes and in the integration of the facilities and

More information

Performance evaluation of multi-camera visual tracking

Performance evaluation of multi-camera visual tracking Performance evaluation of multi-camera visual tracking Lucio Marcenaro, Pietro Morerio, Mauricio Soto, Andrea Zunino, Carlo S. Regazzoni DITEN, University of Genova Via Opera Pia 11A 16145 Genoa - Italy

More information

Monte Carlo-based statistical methods (MASM11/FMS091)

Monte Carlo-based statistical methods (MASM11/FMS091) Monte Carlo-based statistical methods (MASM11/FMS091) Jimmy Olsson Centre for Mathematical Sciences Lund University, Sweden Lecture 5 Sequential Monte Carlo methods I February 5, 2013 J. Olsson Monte Carlo-based

More information

Sales and operations planning (SOP) Demand forecasting

Sales and operations planning (SOP) Demand forecasting ing, introduction Sales and operations planning (SOP) forecasting To balance supply with demand and synchronize all operational plans Capture demand data forecasting Balancing of supply, demand, and budgets.

More information

Regression Analysis. Regression Analysis MIT 18.S096. Dr. Kempthorne. Fall 2013

Regression Analysis. Regression Analysis MIT 18.S096. Dr. Kempthorne. Fall 2013 Lecture 6: Regression Analysis MIT 18.S096 Dr. Kempthorne Fall 2013 MIT 18.S096 Regression Analysis 1 Outline Regression Analysis 1 Regression Analysis MIT 18.S096 Regression Analysis 2 Multiple Linear

More information

Analyzing Structural Equation Models With Missing Data

Analyzing Structural Equation Models With Missing Data Analyzing Structural Equation Models With Missing Data Craig Enders* Arizona State University cenders@asu.edu based on Enders, C. K. (006). Analyzing structural equation models with missing data. In G.

More information

Hybrid Data and Decision Fusion Techniques for Model-Based Data Gathering in Wireless Sensor Networks

Hybrid Data and Decision Fusion Techniques for Model-Based Data Gathering in Wireless Sensor Networks Hybrid Data and Decision Fusion Techniques for Model-Based Data Gathering in Wireless Sensor Networks Lorenzo A. Rossi, Bhaskar Krishnamachari and C.-C. Jay Kuo Department of Electrical Engineering, University

More information

Linear Algebra Review. Vectors

Linear Algebra Review. Vectors Linear Algebra Review By Tim K. Marks UCSD Borrows heavily from: Jana Kosecka kosecka@cs.gmu.edu http://cs.gmu.edu/~kosecka/cs682.html Virginia de Sa Cogsci 8F Linear Algebra review UCSD Vectors The length

More information

These slides follow closely the (English) course textbook Pattern Recognition and Machine Learning by Christopher Bishop

These slides follow closely the (English) course textbook Pattern Recognition and Machine Learning by Christopher Bishop Music and Machine Learning (IFT6080 Winter 08) Prof. Douglas Eck, Université de Montréal These slides follow closely the (English) course textbook Pattern Recognition and Machine Learning by Christopher

More information

Bayes and Naïve Bayes. cs534-machine Learning

Bayes and Naïve Bayes. cs534-machine Learning Bayes and aïve Bayes cs534-machine Learning Bayes Classifier Generative model learns Prediction is made by and where This is often referred to as the Bayes Classifier, because of the use of the Bayes rule

More information

Christfried Webers. Canberra February June 2015

Christfried Webers. Canberra February June 2015 c Statistical Group and College of Engineering and Computer Science Canberra February June (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 829 c Part VIII Linear Classification 2 Logistic

More information

ANALYZING INVESTMENT RETURN OF ASSET PORTFOLIOS WITH MULTIVARIATE ORNSTEIN-UHLENBECK PROCESSES

ANALYZING INVESTMENT RETURN OF ASSET PORTFOLIOS WITH MULTIVARIATE ORNSTEIN-UHLENBECK PROCESSES ANALYZING INVESTMENT RETURN OF ASSET PORTFOLIOS WITH MULTIVARIATE ORNSTEIN-UHLENBECK PROCESSES by Xiaofeng Qian Doctor of Philosophy, Boston University, 27 Bachelor of Science, Peking University, 2 a Project

More information

MATH 423 Linear Algebra II Lecture 38: Generalized eigenvectors. Jordan canonical form (continued).

MATH 423 Linear Algebra II Lecture 38: Generalized eigenvectors. Jordan canonical form (continued). MATH 423 Linear Algebra II Lecture 38: Generalized eigenvectors Jordan canonical form (continued) Jordan canonical form A Jordan block is a square matrix of the form λ 1 0 0 0 0 λ 1 0 0 0 0 λ 0 0 J = 0

More information

Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model

Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model Overview of Violations of the Basic Assumptions in the Classical Normal Linear Regression Model 1 September 004 A. Introduction and assumptions The classical normal linear regression model can be written

More information

1. Introduction. Consider the computation of an approximate solution of the minimization problem

1. Introduction. Consider the computation of an approximate solution of the minimization problem A NEW TIKHONOV REGULARIZATION METHOD MARTIN FUHRY AND LOTHAR REICHEL Abstract. The numerical solution of linear discrete ill-posed problems typically requires regularization, i.e., replacement of the available

More information

Cooperative Object Tracking for Many-on-Many Engagement

Cooperative Object Tracking for Many-on-Many Engagement Cooperative Object Tracking for Many-on-Many Engagement Pradeep Bhatta a and Michael A. Paluszek a a Princeton Satellite Systems, 33 Witherspoon Street, Princeton, NJ, USA ABSTRACT This paper presents

More information

Lecture 8: Signal Detection and Noise Assumption

Lecture 8: Signal Detection and Noise Assumption ECE 83 Fall Statistical Signal Processing instructor: R. Nowak, scribe: Feng Ju Lecture 8: Signal Detection and Noise Assumption Signal Detection : X = W H : X = S + W where W N(, σ I n n and S = [s, s,...,

More information

4.3 Least Squares Approximations

4.3 Least Squares Approximations 18 Chapter. Orthogonality.3 Least Squares Approximations It often happens that Ax D b has no solution. The usual reason is: too many equations. The matrix has more rows than columns. There are more equations

More information

The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables

The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables Monte Carlo Simulation: IEOR E4703 Fall 2004 c 2004 by Martin Haugh The Monte Carlo Framework, Examples from Finance and Generating Correlated Random Variables 1 The Monte Carlo Framework Suppose we wish

More information

Statistical machine learning, high dimension and big data

Statistical machine learning, high dimension and big data Statistical machine learning, high dimension and big data S. Gaïffas 1 14 mars 2014 1 CMAP - Ecole Polytechnique Agenda for today Divide and Conquer principle for collaborative filtering Graphical modelling,

More information

Gaussian Conjugate Prior Cheat Sheet

Gaussian Conjugate Prior Cheat Sheet Gaussian Conjugate Prior Cheat Sheet Tom SF Haines 1 Purpose This document contains notes on how to handle the multivariate Gaussian 1 in a Bayesian setting. It focuses on the conjugate prior, its Bayesian

More information

Stability of the LMS Adaptive Filter by Means of a State Equation

Stability of the LMS Adaptive Filter by Means of a State Equation Stability of the LMS Adaptive Filter by Means of a State Equation Vítor H. Nascimento and Ali H. Sayed Electrical Engineering Department University of California Los Angeles, CA 90095 Abstract This work

More information

MCMC Using Hamiltonian Dynamics

MCMC Using Hamiltonian Dynamics 5 MCMC Using Hamiltonian Dynamics Radford M. Neal 5.1 Introduction Markov chain Monte Carlo (MCMC) originated with the classic paper of Metropolis et al. (1953), where it was used to simulate the distribution

More information

CS229 Lecture notes. Andrew Ng

CS229 Lecture notes. Andrew Ng CS229 Lecture notes Andrew Ng Part X Factor analysis Whenwehavedatax (i) R n thatcomesfromamixtureofseveral Gaussians, the EM algorithm can be applied to fit a mixture model. In this setting, we usually

More information

13 MATH FACTS 101. 2 a = 1. 7. The elements of a vector have a graphical interpretation, which is particularly easy to see in two or three dimensions.

13 MATH FACTS 101. 2 a = 1. 7. The elements of a vector have a graphical interpretation, which is particularly easy to see in two or three dimensions. 3 MATH FACTS 0 3 MATH FACTS 3. Vectors 3.. Definition We use the overhead arrow to denote a column vector, i.e., a linear segment with a direction. For example, in three-space, we write a vector in terms

More information

Dealing with large datasets

Dealing with large datasets Dealing with large datasets (by throwing away most of the data) Alan Heavens Institute for Astronomy, University of Edinburgh with Ben Panter, Rob Tweedie, Mark Bastin, Will Hossack, Keith McKellar, Trevor

More information

BEHAVIOR BASED CREDIT CARD FRAUD DETECTION USING SUPPORT VECTOR MACHINES

BEHAVIOR BASED CREDIT CARD FRAUD DETECTION USING SUPPORT VECTOR MACHINES BEHAVIOR BASED CREDIT CARD FRAUD DETECTION USING SUPPORT VECTOR MACHINES 123 CHAPTER 7 BEHAVIOR BASED CREDIT CARD FRAUD DETECTION USING SUPPORT VECTOR MACHINES 7.1 Introduction Even though using SVM presents

More information

Weighted-Least-Square(WLS) State Estimation

Weighted-Least-Square(WLS) State Estimation Weighted-Least-Square(WLS) State Estimation Yousu Chen PNNL December 18, 2015 This document is a description of how to formulate the weighted-least squares (WLS) state estimation problem. Most of the formulation

More information

19 LINEAR QUADRATIC REGULATOR

19 LINEAR QUADRATIC REGULATOR 19 LINEAR QUADRATIC REGULATOR 19.1 Introduction The simple form of loopshaping in scalar systems does not extend directly to multivariable (MIMO) plants, which are characterized by transfer matrices instead

More information

Robust Neural Network Regression for Offline and Online Learning

Robust Neural Network Regression for Offline and Online Learning Robust Neural Network Regression for Offline and Online Learning homas Briegel* Siemens AG, Corporate echnology D-81730 Munich, Germany thomas.briegel@mchp.siemens.de Volker resp Siemens AG, Corporate

More information

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 4: LINEAR MODELS FOR CLASSIFICATION

PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 4: LINEAR MODELS FOR CLASSIFICATION PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 4: LINEAR MODELS FOR CLASSIFICATION Introduction In the previous chapter, we explored a class of regression models having particularly simple analytical

More information

Imputing Missing Data using SAS

Imputing Missing Data using SAS ABSTRACT Paper 3295-2015 Imputing Missing Data using SAS Christopher Yim, California Polytechnic State University, San Luis Obispo Missing data is an unfortunate reality of statistics. However, there are

More information