Advanced Signal Processing and Digital Noise Reduction



Advanced Signal Processing and Digital Noise Reduction
Saeed V. Vaseghi
Queen's University of Belfast, UK
WILEY / TEUBNER: A Partnership between John Wiley & Sons and B. G. Teubner Publishers
Chichester · New York · Brisbane · Toronto · Singapore · Stuttgart · Leipzig

Contents

Preface xvi

1 Introduction 1
1.1 Signals and Information 2
1.2 Signal Processing Methods 3
1.2.1 Non-parametric Signal Processing 3
1.2.2 Model-based Signal Processing 3
1.2.3 Bayesian Statistical Signal Processing 4
1.2.4 Neural Networks 4
1.3 Applications of Digital Signal Processing 4
1.3.1 Adaptive Noise Cancellation and Noise Reduction 5
1.3.2 Blind Channel Equalisation 7
1.3.3 Signal Classification and Pattern Recognition 8
1.3.4 Linear Prediction Modelling of Speech 9
1.3.5 Digital Coding of Audio Signals 11
1.3.6 Detection of Signals in Noise 13
1.3.7 Directional Reception of Waves: Beamforming 14
1.4 Sampling and Analog to Digital Conversion 16
1.4.1 Time-Domain Sampling and Reconstruction of Analog Signals 17
1.4.2 Quantisation 20
Bibliography 21

2 Stochastic Processes 23
2.1 Random Signals and Stochastic Processes 24
2.1.1 Stochastic Processes 25
2.1.2 The Space or Ensemble of a Random Process 26
2.2 Probabilistic Models of a Random Process 27
2.3 Stationary and Nonstationary Random Processes 31
2.3.1 Strict Sense Stationary Processes 33
2.3.2 Wide Sense Stationary Processes 34
2.3.3 Nonstationary Processes 34
2.4 Expected Values of a Stochastic Process 35
2.4.1 The Mean Value 35
2.4.2 Autocorrelation 36
2.4.3 Autocovariance 37
2.4.4 Power Spectral Density 38
2.4.5 Joint Statistical Averages of Two Random Processes 40
2.4.6 Cross Correlation and Cross Covariance 40
2.4.7 Cross Power Spectral Density and Coherence 42
2.4.8 Ergodic Processes and Time-averaged Statistics 42
2.4.9 Mean-ergodic Processes 43
2.4.10 Correlation-ergodic Processes 44
2.5 Some Useful Classes of Random Processes 45
2.5.1 Gaussian (Normal) Process 45
2.5.2 Multi-variate Gaussian Process 47
2.5.3 Mixture Gaussian Process 48
2.5.4 A Binary-state Gaussian Process 49
2.5.5 Poisson Process 50
2.5.6 Shot Noise 52
2.5.7 Poisson-Gaussian Model for Clutters and Impulsive Noise 53
2.5.8 Markov Processes 54
2.6 Transformation of a Random Process 57
2.6.1 Monotonic Transformation of Random Signals 58
2.6.2 Many-to-one Mapping of Random Signals 60
Summary 62
Bibliography 63

3 Bayesian Estimation and Classification 65
3.1 Estimation Theory: Basic Definitions 66
3.1.1 Predictive and Statistical Models in Estimation 66
3.1.2 Parameter Space 67
3.1.3 Parameter Estimation and Signal Restoration 68
3.1.4 Performance Measures 69
3.1.5 Prior and Posterior Spaces and Distributions 71
3.2 Bayesian Estimation 74
3.2.1 Maximum a Posterior Estimation 75
3.2.2 Maximum Likelihood Estimation 76
3.2.3 Minimum Mean Squared Error Estimation 79
3.2.4 Minimum Mean Absolute Value of Error Estimation 81
3.2.5 Equivalence of MAP, ML, MMSE and MAVE 82
3.2.6 Influence of the Prior on Estimation Bias and Variance 82
3.2.7 The Relative Importance of the Prior and the Observation 86
3.3 Estimate-Maximise (EM) Method 90
3.3.1 Convergence of the EM Algorithm 91
3.4 Cramer-Rao Bound on the Minimum Estimator Variance 93
3.4.1 Cramer-Rao Bound for Random Parameters 95
3.4.2 Cramer-Rao Bound for a Vector Parameter 95
3.5 Bayesian Classification 96
3.5.1 Classification of Discrete-valued Parameters 96
3.5.2 Maximum a Posterior Classification 98
3.5.3 Maximum Likelihood Classification 98
3.5.4 Minimum Mean Squared Error Classification 99
3.5.5 Bayesian Classification of Finite State Processes 99
3.5.6 Bayesian Estimation of the Most Likely State Sequence 101
3.6 Modelling the Space of a Random Signal 102
3.6.1 Vector Quantisation of a Random Process 103
3.6.2 Design of a Vector Quantiser: K-Means Algorithm 103
3.6.3 Design of a Mixture Gaussian Model 104
3.6.4 The EM Algorithm for Estimation of Mixture Gaussian Densities 105
Summary 108
Bibliography 109

4 Hidden Markov Models 111
4.1 Statistical Models for Nonstationary Processes 112
4.2 Hidden Markov Models 114
4.2.1 A Physical Interpretation of Hidden Markov Models 115
4.2.2 Hidden Markov Model As a Bayesian Method 116
4.2.3 Parameters of a Hidden Markov Model 117
4.2.4 State Observation Models 118
4.2.5 State Transition Probabilities 119
4.2.6 State-Time Trellis Diagram 120
4.3 Training Hidden Markov Models 121
4.3.1 Forward-Backward Probability Computation 122
4.3.2 Baum-Welch Model Re-Estimation 124
4.3.3 Training Discrete Observation Density HMMs 125
4.3.4 HMMs with Continuous Observation PDFs 127
4.3.5 HMMs with Mixture Gaussian pdfs 128
4.4 Decoding of Signals Using Hidden Markov Models 129
4.4.1 Viterbi Decoding Algorithm 131
4.5 HMM-based Estimation of Signals in Noise 133
4.5.1 HMM-based Wiener Filters 135
4.5.2 Modelling Noise Characteristics 136
Summary 137
Bibliography 138

5 Wiener Filters 140
5.1 Wiener Filters: Least Squared Error Estimation 141
5.2 Block-data Formulation of the Wiener Filter 145
5.3 Vector Space Interpretation of Wiener Filters 148
5.4 Analysis of the Least Mean Squared Error Signal 150
5.5 Formulation of Wiener Filter in Frequency Domain 151
5.6 Some Applications of Wiener Filters 152
5.6.1 Wiener Filter for Additive Noise Reduction 153
5.6.2 Wiener Filter and Separability of Signal and Noise 155
5.6.3 Squared Root Wiener Filter 156
5.6.4 Wiener Channel Equaliser 157
5.6.5 Time-alignment of Signals 158
5.6.6 Implementation of Wiener Filters 159
Summary 161
Bibliography 162

6 Kalman and Adaptive Least Squared Error Filters 164
6.1 State-space Kalman Filters 165
6.2 Sample Adaptive Filters 171
6.3 Recursive Least Squares (RLS) Adaptive Filters 172
6.4 The Steepest Descent Method 177
6.5 The LMS Adaptation Method 181
Summary 182
Bibliography 183

7 Linear Prediction Models 185
7.1 Linear Prediction Coding 186
7.1.1 Least Mean Squared Error Predictor 189
7.1.2 The Inverse Filter: Spectral Whitening 191
7.1.3 The Prediction Error Signal 193
7.2 Forward, Backward and Lattice Predictors 193
7.2.1 Augmented Equations for Forward and Backward Predictors 195
7.2.2 Levinson-Durbin Recursive Solution 196
7.2.3 Lattice Predictors 198
7.2.4 Alternative Formulations of Least Squared Error Predictors 200
7.2.5 Model Order Selection 201
7.3 Short-term and Long-term Predictors 202
7.4 MAP Estimation of Predictor Coefficients 204
7.5 Signal Restoration Using Linear Prediction Models 207
7.5.1 Frequency Domain Signal Restoration 209
Summary 212
Bibliography 212

8 Power Spectrum Estimation 214
8.1 Fourier Transform, Power Spectrum and Correlation 215
8.1.1 Fourier Transform 215
8.1.2 Discrete Fourier Transform (DFT) 217
8.1.3 Frequency Resolution and Spectral Smoothing 217
8.1.4 Energy Spectral Density and Power Spectral Density 218
8.2 Non-parametric Power Spectrum Estimation 220
8.2.1 The Mean and Variance of Periodograms 221
8.2.2 Averaging Periodograms (Bartlett Method) 221
8.2.3 Welch Method: Averaging Periodograms from Overlapped and Windowed Segments 222
8.2.4 Blackman-Tukey Method 224
8.2.5 Power Spectrum Estimation from Autocorrelation of Overlapped Segments 225
8.3 Model-based Power Spectrum Estimation 225
8.3.1 Maximum Entropy Spectral Estimation 227
8.3.2 Autoregressive Power Spectrum Estimation 229
8.3.3 Moving Average Power Spectral Estimation 230
8.3.4 Autoregressive Moving Average Power Spectral Estimation 231
8.4 High Resolution Spectral Estimation Based on Subspace Eigen Analysis 232
8.4.1 Pisarenko Harmonic Decomposition 232
8.4.2 Multiple Signal Classification (MUSIC) Spectral Estimation 235
8.4.3 Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT) 238
Summary 240
Bibliography 240

9 Spectral Subtraction 242
9.1 Spectral Subtraction 243
9.1.1 Power Spectrum Subtraction 246
9.1.2 Magnitude Spectrum Subtraction 247
9.1.3 Spectral Subtraction Filter: Relation to Wiener Filters 247
9.2 Processing Distortions 248
9.2.1 Effect of Spectral Subtraction on Signal Distribution 250
9.2.2 Reducing the Noise Variance 251
9.2.3 Filtering Out the Processing Distortions 251
9.3 Non-linear Spectral Subtraction 252
9.4 Implementation of Spectral Subtraction 255
9.4.1 Application to Speech Restoration and Recognition 257
Summary 259
Bibliography 259

10 Interpolation 261
10.1 Introduction 262
10.1.1 Interpolation of a Sampled Signal 262
10.1.2 Digital Interpolation by a Factor of I 263
10.1.3 Interpolation of a Sequence of Lost Samples 265
10.1.4 Factors that Affect Interpolation 267
10.2 Polynomial Interpolation 268
10.2.1 Lagrange Polynomial Interpolation 269
10.2.2 Newton Interpolation Polynomial 270
10.2.3 Hermite Interpolation Polynomials 273
10.2.4 Cubic Spline Interpolation 273
10.3 Statistical Interpolation 276
10.3.1 Maximum a Posterior Interpolation 277
10.3.2 Least Squared Error Autoregressive Interpolation 279
10.3.3 Interpolation Based on a Short-term Prediction Model 279
10.3.4 Interpolation Based on Long-term and Short-term Correlations 282
10.3.5 LSAR Interpolation Error 285
10.3.6 Interpolation in Frequency-Time Domain 287
10.3.7 Interpolation using Adaptive Code Books 289
10.3.8 Interpolation Through Signal Substitution 289
Summary 291
Bibliography 292

11 Impulsive Noise 294
11.1 Impulsive Noise 295
11.1.1 Autocorrelation and Power Spectrum of Impulsive Noise 297
11.2 Stochastic Models for Impulsive Noise 298
11.2.1 Bernoulli-Gaussian Model of Impulsive Noise 299
11.2.2 Poisson-Gaussian Model of Impulsive Noise 299
11.2.3 A Binary State Model of Impulsive Noise 300
11.2.4 Signal to Impulsive Noise Ratio 302
11.3 Median Filters 302
11.4 Impulsive Noise Removal Using Linear Prediction Models 304
11.4.1 Impulsive Noise Detection 304
11.4.2 Analysis of Improvement in Noise Detectability 306
11.4.3 Two-sided Predictor 308
11.4.4 Interpolation of Discarded Samples 308
11.5 Robust Parameter Estimation 309
11.6 Restoration of Archived Gramophone Records 311
Summary 312
Bibliography 312

12 Transient Noise 314
12.1 Transient Noise Waveforms 315
12.2 Transient Noise Pulse Models 316
12.2.1 Noise Pulse Templates 317
12.2.2 Autoregressive Model of Transient Noise 317
12.2.3 Hidden Markov Model of a Noise Pulse Process 318
12.3 Detection of Noise Pulses 319
12.3.1 Matched Filter 320
12.3.2 Noise Detection Based on Inverse Filtering 321
12.3.3 Noise Detection Based on HMM 322
12.4 Removal of Noise Pulse Distortions 323
12.4.1 Adaptive Subtraction of Noise Pulses 323
12.4.2 AR-based Restoration of Signals Distorted by Noise Pulses 324
Summary 327
Bibliography 327

13 Echo Cancellation 328
13.1 Telephone Line Echoes 329
13.1.1 Telephone Line Echo Suppression 330
13.2 Adaptive Echo Cancellation 331
13.2.1 Convergence of Line Echo Canceller 333
13.2.2 Echo Cancellation for Digital Data Transmission over Subscriber's Loop 334
13.3 Acoustic Feedback Coupling 335
13.4 Sub-band Acoustic Echo Cancellation 339
Summary 341
Bibliography 341

14 Blind Deconvolution and Channel Equalisation 343
14.1 Introduction 344
14.1.1 The Ideal Inverse Channel Filter 345
14.1.2 Equalisation Error, Convolutional Noise 346
14.1.3 Blind Equalisation 347
14.1.4 Minimum and Maximum Phase Channels 349
14.1.5 Wiener Equaliser 350
14.2 Blind Equalisation Using Channel Input Power Spectrum 352
14.2.1 Homomorphic Equalisation 354
14.2.2 Homomorphic Equalisation using a Bank of High Pass Filters 356
14.3 Equalisation Based on Linear Prediction Models 356
14.3.1 Blind Equalisation Through Model Factorisation 358
14.4 Bayesian Blind Deconvolution and Equalisation 360
14.4.1 Conditional Mean Channel Estimation 360
14.4.2 Maximum Likelihood Channel Estimation 361
14.4.3 Maximum a Posterior Channel Estimation 361
14.4.4 Channel Equalisation Based on Hidden Markov Models 362
14.4.5 MAP Channel Estimate Based on HMMs 365
14.4.6 Implementations of HMM-Based Deconvolution 366
14.5 Blind Equalisation for Digital Communication Channels 369
14.6 Equalisation Based on Higher-Order Statistics 375
14.6.1 Higher-Order Moments 376
14.6.2 Higher Order Spectra of Linear Time-Invariant Systems 379
14.6.3 Blind Equalisation Based on Higher Order Cepstrum 379
Summary 385
Bibliography 385

Frequently used Symbols and Abbreviations 388
Index 391