Nonlinear Blind Source Separation and Independent Component Analysis
Prof. Juha Karhunen
Helsinki University of Technology, Neural Networks Research Centre
Espoo, Finland
Part I: Linear Independent Component Analysis and Blind Source Separation
Motivation for independent component analysis (ICA) and blind source separation (BSS)

Let us start with an example: three people are speaking simultaneously in a room that has three microphones. Denote the microphone signals by $x_1(t)$, $x_2(t)$, and $x_3(t)$. Each is a weighted sum of the speech signals, which we denote by $s_1(t)$, $s_2(t)$, and $s_3(t)$:

$$x_1(t) = a_{11} s_1(t) + a_{12} s_2(t) + a_{13} s_3(t) \qquad (1)$$
$$x_2(t) = a_{21} s_1(t) + a_{22} s_2(t) + a_{23} s_3(t) \qquad (2)$$
$$x_3(t) = a_{31} s_1(t) + a_{32} s_2(t) + a_{33} s_3(t) \qquad (3)$$

Cocktail-party problem: estimate the original speech signals using only the recorded signals.
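To make the mixing model concrete, here is a minimal Python sketch that generates three mixtures according to Eqs. (1)-(3). The sources are assumed toy signals, not the lecture's actual speech recordings; they are chosen to be super-Gaussian and temporally correlated so they loosely resemble speech. The arrays `s`, `A`, and `x` are reused in the later sketches.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 8000
t = np.linspace(0.0, 10.0, T)

# Three independent, bursty (super-Gaussian) toy sources standing in
# for the speech signals s_1(t), s_2(t), s_3(t).
tone = np.sin(2 * np.pi * 3.0 * t) * np.sin(2 * np.pi * 0.25 * t) ** 4
noise = np.convolve(rng.laplace(size=T), np.ones(10) / 10, mode="same")
spikes = rng.laplace(size=T) * (rng.random(T) < 0.05)
spikes = np.convolve(spikes, np.exp(-np.arange(50) / 10.0), mode="same")
s = np.vstack([tone, noise, spikes])

# Unknown mixing matrix A = (a_ij); each microphone signal x_i(t) is a
# weighted sum of the sources, x = A s, as in Eqs. (1)-(3).
A = rng.uniform(0.5, 2.0, size=(3, 3))
x = A @ s   # shape (3, 8000): the observed mixtures
```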
Figure 1: The original speech waveforms.
Figure 2: The observed microphone signals.
The problem: find the sources $s_1(t)$, $s_2(t)$, and $s_3(t)$ from the observed signals $x_1(t)$, $x_2(t)$, and $x_3(t)$.

As the weights $a_{ij}$ are different, we may assume that the matrix $A = (a_{ij})$, although unknown, is invertible. Thus there exists another set of weights $w_{ij}$ such that

$$s_1(t) = w_{11} x_1(t) + w_{12} x_2(t) + w_{13} x_3(t) \qquad (4)$$
$$s_2(t) = w_{21} x_1(t) + w_{22} x_2(t) + w_{23} x_3(t)$$
$$s_3(t) = w_{31} x_1(t) + w_{32} x_2(t) + w_{33} x_3(t)$$

It turns out that this blind source separation (BSS) problem can be solved using independent component analysis (ICA). In ICA, it suffices to assume that the sources $s_j(t)$ are nongaussian and statistically independent.
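If the mixing matrix were known, the separating weights of Eq. (4) would simply be the rows of $A^{-1}$; a two-line check, continuing the sketch above:

```python
# With A known, W = A^{-1} recovers the sources exactly; the blind problem
# is that such a W must be estimated from the mixtures x alone.
W = np.linalg.inv(A)
s_hat = W @ x
print(np.allclose(s_hat, s))  # True
```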
Figure 3: The estimates of the speech waveforms obtained by ICA.
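Estimates such as those in Figure 3 can be computed with any linear ICA implementation; here is a minimal sketch using scikit-learn's FastICA (an assumed choice of library), applied to the toy mixtures generated earlier:

```python
from sklearn.decomposition import FastICA

# scikit-learn expects data as (samples, features), hence the transposes.
ica = FastICA(n_components=3, random_state=0)
s_est = ica.fit_transform(x.T).T   # shape (3, 8000): estimated sources

# The estimates match the true sources only up to scaling, sign, and order.
```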
Definition of Independent Component Analysis

The ICA model is a statistical latent variable model

$$x_i = a_{i1} s_1 + a_{i2} s_2 + \cdots + a_{in} s_n, \quad i = 1, \ldots, n \qquad (5)$$

where the $a_{ij}$, $i, j = 1, \ldots, n$, are some real coefficients. This is the basic linear ICA model, which can be extended in many ways. In the basic ICA model, we assume that each mixture $x_i$ as well as each independent component $s_j$ is a random variable.

Using vector-matrix formulation, let

$$x = (x_1, \ldots, x_n)^T, \quad s = (s_1, \ldots, s_n)^T, \quad A = (a_{ij}) \qquad (6)$$

Then the basic ICA model is

$$x = As \qquad (7)$$
If the columns of $A$ are denoted $a_i$, the model can also be written as

$$x = \sum_{i=1}^{n} a_i s_i \qquad (8)$$

There are some basic assumptions or restrictions in the model.

1. The independent components are assumed statistically independent.
2. The independent components must have nongaussian distributions.
   - In the basic ICA, we need not know these distributions.
3. In the basic ICA, the unknown mixing matrix $A$ is square.
   - In other words, the number of independent components is equal to the number of observed mixtures.
   - This assumption can be relaxed by allowing more or fewer mixtures than independent components.
Indeterminacies in the basic ICA model: scaling, sign, and order of the independent components. That is, only the waveforms of the independent components can be recovered without further information.

Methods for linear ICA

Independent components are usually estimated by trying to find an inverse, or separating, matrix $B$ and computing the output vector

$$y = Bx \qquad (9)$$

whose components should be statistically independent. Ideally, $B = A^{-1}$.

Even though the ICA model $x = As$ is linear and simple, the problem is difficult because of its blind nature. Higher-order statistics are needed for ICA: using second-order statistics (covariances) provides uncorrelated components only, and there exist infinitely many such uncorrelated solutions, most of them quite different from the ICA solution.

However, prewhitening the data vectors $x$ so that their components become uncorrelated is a useful preprocessing step. After that, the separating matrix $B$ becomes orthogonal.

Many methods for linear ICA now exist; the most popular of them are:

The natural gradient algorithm

$$\Delta B = \mu [I - g(y) y^T] B \qquad (10)$$
- Here $g(y)$ is a suitable nonlinearity applied componentwise to the output vector $y$.
- A simple adaptive neural algorithm, well justified theoretically.

Fixed-point (FastICA) algorithms

- Fast batch algorithms applicable to large-scale problems.

For more information, see our new 500-page textbook/monograph: A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis, Wiley, 2001.

Linear blind source separation (BSS)

In linear blind source separation (BSS), one tries to separate the original source signals from their linear mixtures. Assuming that the sources are independent and the mixing model is linear, $x = As$, one can apply linear ICA methods directly to BSS, as in the sketch below.
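As a concrete illustration, here is a rough batch-mode sketch of the natural gradient update (10), run on the toy mixtures from the earlier sketch. The nonlinearity $g(y) = \tanh(y)$ and the step size are assumed choices, suitable for super-Gaussian sources such as the toy signals above, not prescriptions from the lecture:

```python
import numpy as np

def natural_gradient_ica(x, mu=0.05, n_iter=1000):
    n, T = x.shape
    x = x - x.mean(axis=1, keepdims=True)

    # Prewhitening: after this step the remaining separating matrix
    # can be taken orthogonal, as noted above.
    d, E = np.linalg.eigh(np.cov(x))
    V = (E / np.sqrt(d)) @ E.T          # whitening matrix
    z = V @ x

    B = np.eye(n)
    for _ in range(n_iter):
        y = B @ z
        # Batch form of Delta B = mu [I - g(y) y^T] B, with g = tanh and
        # the expectation replaced by a sample mean over the T samples.
        B += mu * (np.eye(n) - np.tanh(y) @ y.T / T) @ B
    return B @ V                         # total separating matrix

y_est = natural_gradient_ica(x) @ x      # estimated sources (up to
                                         # scaling, sign, and order)
```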
Another major group of linear BSS methods utilizes the time structure of the sources. Second-order temporal statistics are then sufficient for achieving blind separation; the sources can even be Gaussian, provided that they have different autocorrelation sequences.

ICA neglects possible temporal structure of the sources or independent components, treating them as plain random variables; on the other hand, it works even for temporally uncorrelated sources. Ideally, both spatial independence and temporal structure should be taken into account in estimation.
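For intuition, here is a minimal AMUSE-style sketch of the second-order approach (an assumed illustrative algorithm: whiten, then eigendecompose a single time-lagged covariance matrix). It separates the toy mixtures because those sources have distinct autocorrelations at the chosen lag; no higher-order statistics are used anywhere.

```python
import numpy as np

def amuse(x, lag=1):
    n, T = x.shape
    x = x - x.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(x))
    V = (E / np.sqrt(d)) @ E.T                 # whitening matrix
    z = V @ x

    # Symmetrized covariance of the whitened data at the chosen lag;
    # its eigenvectors give the remaining orthogonal rotation.
    C = z[:, lag:] @ z[:, :-lag].T / (T - lag)
    _, U = np.linalg.eigh(0.5 * (C + C.T))
    return U.T @ V                             # separating matrix

s_td = amuse(x, lag=5) @ x   # second-order estimates of the sources
```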
Practical applications of ICA

- The cocktail-party problem: separation of voices, music, or sounds.
- Sensor array processing, e.g. radar.
- Biomedical signal processing with multiple sensors: EEG, ECG, MEG, fMRI.
- Telecommunications: e.g. multiuser detection in CDMA.
- Financial and other time series.
- Noise removal from signals and images.
- Feature extraction for images and signals.
- Projection pursuit: finding interesting projections of the data for visualizing it in two dimensions.
Figure 4: Basis functions in ICA of natural images. These basis functions can be considered as the independent features of images. Every image window is a linear sum of these windows.
Figure 5: 12 magnetic brain (MEG) signals containing various artifacts: ocular and muscle activity, the cardiac cycle, and magnetic disturbances. (Plotted channels include MEG, VEOG, HEOG, and ECG over a 10 s span; marked events: saccades, blinking, biting.)
Figure 6: An example of multipath propagation in an urban environment (magnitude as a function of time delay).
Extensions of basic linear ICA

- Noisy ICA; estimation of the mixing matrix and independent components requires more sophisticated methods.
- Overcomplete bases: the number of independent components is larger than the number of mixtures.
- Taking into account the temporal structure in the data.
- ICA and BSS for nonlinear mixture models.
- Separation of convolutive mixtures containing time delays.
- Separation of correlated or non-independent sources.
- Nonstationary sources, time-dependent mixing matrices.
- Semi-blind problems: some prior information on the source signals and/or mixtures is available.