Time Series Analysis by Higher Order Crossings

Benjamin Kedem
Department of Mathematics & ISR, University of Maryland, College Park, MD



Stochastic Process

A stochastic or random process {Z_t}, t = ..., -1, 0, 1, ..., is a collection of random variables, real or complex-valued, defined on the same probability space.

Gaussian Process: A real-valued process {Z_t}, t ∈ T, is called a Gaussian process if for all t_1, t_2, ..., t_n ∈ T, the joint distribution of (Z_{t_1}, Z_{t_2}, ..., Z_{t_n}) is multivariate normal. The finite-dimensional distributions of a Gaussian process are completely determined by

    m(t) = E[Z_t]   and   R(s, t) = Cov[Z_s, Z_t].

Markov Process: For t_1 < ... < t_{n-2} < t_{n-1} < t_n,

    P(Z_{t_n} ≤ z | Z_{t_{n-1}}, Z_{t_{n-2}}, ..., Z_{t_1}) = P(Z_{t_n} ≤ z | Z_{t_{n-1}}).

Stationary Processes

A stochastic process {Z_t} is said to be strictly stationary if its joint distributions are invariant under time shifts:

    (Z_{t_1}, Z_{t_2}, ..., Z_{t_n})  =_Dist  (Z_{t_1+τ}, Z_{t_2+τ}, ..., Z_{t_n+τ})

for all t_1, t_2, ..., t_n, n, and τ. When second-order moments exist, strict stationarity implies

    E[Z_t] = E[Z_{t+τ}] = E[Z_0] = m                                  (1)
    Cov[Z_t, Z_s] = R(t - s).                                         (2)

{Z_t} is called weakly stationary when (1) and (2) hold. For simplicity, we shall assume all our processes are both strictly and weakly stationary and also real-valued.

Assume: E(Z_t) = 0.

Autocovariance:
    R_k = E(Z_t Z_{t-k}) = ∫_{-π}^{π} cos(kλ) dF(λ)                   (3)

Autocorrelation:
    ρ_k = R_k / R_0 = ∫_{-π}^{π} cos(kλ) dF(λ) / ∫_{-π}^{π} dF(λ)

Spectral Distribution: F(λ), -π ≤ λ ≤ π.

When F is absolutely continuous, we have the Spectral Density: f(λ) = F'(λ), -π ≤ λ ≤ π.

In general, in practice,
    F(λ) = F_c(λ) + F_d(λ),
where F_c(λ) is absolutely continuous and F_d(λ) is a step function, both monotone nondecreasing.

R_k, ρ_k, f(λ) are symmetric.
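
As a quick illustration (mine, not from the slides), an R sketch of the sample autocovariance and autocorrelation for a series treated as zero-mean; the function name sample.acov and the white-noise example series are hypothetical.

sample.acov <- function(z, max.lag = 10) {
  # sample R_k = (1/N) * sum_t z_t * z_{t-k}, k = 0, ..., max.lag (mean assumed zero)
  N <- length(z)
  sapply(0:max.lag, function(k) sum(z[(k + 1):N] * z[1:(N - k)]) / N)
}
z   <- rnorm(1000)               # stand-in zero-mean series (white noise)
R   <- sample.acov(z, 10)
rho <- R / R[1]                  # sample autocorrelations rho_0, ..., rho_10 (rho_0 = 1)
round(rho, 2)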

Spectral Representation

A. Kolmogorov and H. Cramér, early 1940s.

Let {Z_t}, t = 0, ±1, ±2, ..., be a zero-mean weakly stationary process. Then (3) implies

    Z_t = ∫_{-π}^{π} e^{itλ} dξ(λ),   t = 0, ±1, ...                  (4)

where the spectral distribution satisfies

    E[dξ(λ) dξ(ω)] = dF(λ),  if λ = ω
                   = 0,      if λ ≠ ω                                  (5)

We may interpret dF(λ) = E|dξ(λ)|² as the weight or power given to frequency λ.

Example: Sum of Random Sinusoids.

    Z_t = Σ_{j=1}^{p} { A_j cos(ω_j t) + B_j sin(ω_j t) },   t = 0, ±1, ...        (6)

A_1, ..., A_p, B_1, ..., B_p uncorrelated, E[A_j] = E[B_j] = 0, Var[A_j] = Var[B_j] = σ_j², ω_j ∈ (0, π), for all j.

Then for all t, E[Z_t] = 0, and

    R_k = E[Z_t Z_{t-k}] = Σ_{j=1}^{p} σ_j² cos(ω_j k)
        = Σ_{j=1}^{p} { (1/2)σ_j² cos(ω_j k) + (1/2)σ_j² cos(-ω_j k) },   k = 0, ±1, ...

    ρ_k = R_k / R_0 = Σ_{j=1}^{p} σ_j² cos(ω_j k) / Σ_{j=1}^{p} σ_j²,   k = 0, ±1, ...   (7)

Discrete spectrum: F(ω) is a nondecreasing step function with jumps of size (1/2)σ_j² at ±ω_j, and F(-π) = 0, F(π) = R_0 = Σ_{j=1}^{p} σ_j².
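
An R sketch (my illustration, with assumed values of p, ω_j, σ_j²) that simulates (6) and compares the lag-one autocorrelation with formula (7). For a single realization the empirical weights are (A_j² + B_j²)/2 rather than σ_j², so the agreement with (7) is only approximate.

set.seed(1)
omega  <- c(0.513, 0.771)                   # assumed frequencies
sigma2 <- c(0.25, 1)                        # assumed sinusoid variances
N <- 5000; t <- 1:N
A <- rnorm(2, 0, sqrt(sigma2)); B <- rnorm(2, 0, sqrt(sigma2))
z <- as.vector(cos(outer(t, omega)) %*% A + sin(outer(t, omega)) %*% B)

rho1.formula  <- sum(sigma2 * cos(omega)) / sum(sigma2)   # ensemble rho_1 from (7)
w             <- (A^2 + B^2) / 2                          # realized power of each sinusoid
rho1.realized <- sum(w * cos(omega)) / sum(w)             # what one sample path reflects
rho1.sample   <- sum(z[-1] * z[-N]) / sum(z^2)
round(c(rho1.formula, rho1.realized, rho1.sample), 3)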

Example: Stationary AR(1) Process.

Let {ε_t}, t = 0, ±1, ±2, ..., be a sequence of uncorrelated real-valued random variables with mean zero and variance σ_ε². Define

    Z_t = φ_1 Z_{t-1} + ε_t,   t = 0, ±1, ±2, ...                      (8)

where |φ_1| < 1. The process (8) is called a first-order autoregressive process and is commonly denoted by AR(1).

    E[Z_t] = E[ lim_{n→∞} Σ_{j=0}^{n} φ_1^j ε_{t-j} ] = lim_{n→∞} E[ Σ_{j=0}^{n} φ_1^j ε_{t-j} ] = 0

    R_k = σ_ε² φ_1^{|k|} / (1 - φ_1²),   k = 0, ±1, ±2, ...
    ρ_k = φ_1^{|k|},   k = 0, ±1, ±2, ...

Continuous spectrum:

    f(λ) = (σ_ε² / 2π) · 1 / (1 - 2φ_1 cos(λ) + φ_1²),   -π ≤ λ ≤ π.
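
An R sketch (illustration only; φ_1 and N are assumed values) that simulates an AR(1) path and compares the sample ACF with ρ_k = φ_1^k.

set.seed(2)
phi <- 0.8; N <- 2000
z <- as.vector(arima.sim(model = list(ar = phi), n = N))
rho.hat <- acf(z, lag.max = 5, plot = FALSE)$acf[-1]   # sample rho_1, ..., rho_5
round(rbind(theory = phi^(1:5), sample = rho.hat), 3)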

Example: Sum of Random Sinusoids Plus Noise.

Let {Z_t} be a signal as in (6), and let {ε_t} be a zero-mean weakly stationary noise, uncorrelated with {Z_t}, whose spectrum possesses a spectral density f_ε(ω),

    F_ε(ω) = ∫_{-π}^{ω} f_ε(λ) dλ.

Then the process

    Y_t = Σ_{j=1}^{p} { A_j cos(ω_j t) + B_j sin(ω_j t) } + ε_t,   t = 0, ±1, ...   (9)

has a mixed spectrum of the form

    F_y(ω) = F_z(ω) + F_ε(ω) = Σ_{λ_j ≤ ω} (1/2)σ_j² + ∫_{-π}^{ω} f_ε(λ) dλ,

    λ_j ∈ { -ω_p, -ω_{p-1}, ..., ω_{p-1}, ω_p }.

(Figure: (a) sum of two sinusoids, A=0.5, B=1, w1=0.513, w2=0.771, CN=0, SNR=Inf; (b) sum of two sinusoids plus white noise, A=0.5, B=1, w1=0.513, w2=0.771, CN=2.2.)

(Figure: plots of AR(1) time series and their estimated autocorrelations, for PHI=0.8, PHI=-0.8, and white noise (PHI=0.0).)

(Figure: plots of AR(1) time series and their spectral densities f.ar1(1, PHI) on [0, π], for PHI=0.8, PHI=-0.8, and white noise (PHI=0.0).)

Linear Filtering

By a time-invariant linear filter applied to a stationary time series {Z_t}, t = 0, ±1, ..., we mean the linear operation or convolution

    Y_t = L({Z_t}) = Σ_{j=-∞}^{∞} h_j Z_{t-j}                          (10)

with

    H(λ) ≡ Σ_{j=-∞}^{∞} h_j e^{-ijλ},   0 < λ ≤ π.

The function H(λ) is called the transfer function, and |H(λ)| is called the gain.

Fact:

    dF_y(λ) = |H(λ)|² dF_z(λ)                                          (11)

In particular, when spectral densities exist we have

    f_y(λ) = |H(λ)|² f_z(λ)                                            (12)

This is an important relationship between the input and output spectral densities.
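
An R sketch (mine; the filter coefficients are arbitrary) of the convolution (10) and of evaluating H(λ) on a grid; its squared modulus is the quantity entering (11) and (12).

h <- c(0.25, 0.5, 0.25)                    # arbitrary example coefficients h_0, h_1, h_2
z <- rnorm(500)
y <- filter(z, h, method = "convolution", sides = 1)    # Y_t = sum_j h_j Z_{t-j}

lambda <- seq(0, pi, length = 200)
H <- sapply(lambda, function(l) sum(h * exp(-1i * (0:(length(h) - 1)) * l)))
gain2 <- Mod(H)^2                          # |H(lambda)|^2 as in (11)-(12)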

The Difference Operator: ∇Z_t ≡ Z_t - Z_{t-1}

This is a linear filter with h_0 = 1, h_1 = -1, and h_j = 0 otherwise. The transfer function is

    H(λ) = 1 - e^{-iλ}

and the squared gain is

    |H(λ)|² = |1 - e^{-iλ}|² = 2(1 - cos λ).

In [0, π] the gain is monotone increasing, and hence this is a high-pass filter.

The squared gain of the second difference ∇² is 4(1 - cos λ)², and hence this is a more pronounced high-pass filter. Repeated differencing is a simple way to obtain high-pass filters.
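
A one-line R check (my illustration) that the squared gain of the difference filter, |1 - e^{-iλ}|², equals 2(1 - cos λ):

lambda <- seq(0, pi, length = 5)
cbind(direct = Mod(1 - exp(-1i * lambda))^2, formula = 2 * (1 - cos(lambda)))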

Differencing white noise: higher frequencies get more power.

(Figure: white noise, Diff(WN), and Diff^5(WN), with spectral densities f.ar1(1, 0), 2*(1 - cos(lambda))*f.ar1(1, 0), and (2*(1 - cos(lambda)))^5*f.ar1(1, 0) over lambda in [0, π].)

We define a parametric filter by the convolution

    Z_t(θ) ≡ L_θ(Z)_t = Σ_n h_n(θ) Z_{t-n}                             (13)

In other words,

    Z_t(θ) ≡ h_t(θ) * Z_t                                              (14)

where * denotes convolution.

The Parametric AR(1) Filter.

Let |α| < 1. The AR(1) (or α-) filter is the recursive filter

    Y_t = αY_{t-1} + Z_t

or

    Y_t = L_α(Z)_t = Z_t + αZ_{t-1} + α²Z_{t-2} + ...

The transfer function for ω ∈ [0, π] is

    H(λ) = 1 / (1 - αe^{-iλ})

and the squared gain is

    |H(ω; α)|² = 1 / (1 - 2α cos(ω) + α²),   α ∈ (-1, 1).              (15)
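
An R sketch (illustration; α is an assumed value) of the recursive AR(1) filter and of its squared gain (15); for α > 0 the gain is largest near ω = 0, the low-pass behaviour noted on the next slide.

alpha <- 0.5
z <- rnorm(1000)
y <- filter(z, alpha, method = "recursive")             # Y_t = alpha * Y_{t-1} + Z_t
omega <- seq(0, pi, length = 5)
round(1 / (1 - 2 * alpha * cos(omega) + alpha^2), 3)    # squared gain (15) on a grid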

For α > 0 the AR(1) filter is a low-pass filter, and a high-pass filter for α < 0.

(Figure: squared gain of the AR(1) filter versus omega, for two values of a, one of them a = 0.5.)

The Parametric AR(2) Filter.

With α ∈ (-1, 1), define Y_t(α) by operating on Z_t,

    Y_t(α) = (1 + η²)αY_{t-1}(α) - η²Y_{t-2}(α) + Z_t                  (16)

where η ∈ (0, 1) is the bandwidth parameter.

The squared gain of the AR(2) filter is centered approximately at cos⁻¹(α), for η close to 1.

(Figure: squared gain of the AR(2) filter for (alpha, eta) = (0.7, 0.92), (0.6, 0.94), (0.5, 0.96).)
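
An R sketch (illustration; the (α, η) pair is one of the values shown in the figure) of the recursion (16) written as a recursive filter with coefficients ((1 + η²)α, -η²), the same form used in the S-Plus code near the end of these notes.

ar2.filter <- function(z, alpha, eta) {
  filter(z, c((1 + eta^2) * alpha, -eta^2), method = "recursive")   # recursion (16)
}
y <- ar2.filter(rnorm(1000), alpha = 0.6, eta = 0.94)    # narrow band-pass near acos(0.6)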

Zero-Crossings in Discrete Time

Let Z_1, Z_2, ..., Z_N be a zero-mean stationary time series. The zero-crossing count in discrete time is defined as the number of symbol changes in the corresponding clipped binary time series.

First define the clipped binary time series:

    X_t = 1 if Z_t ≥ 0,   X_t = 0 if Z_t < 0.

The number of zero-crossings, denoted by D, is defined in terms of {X_t}:

    D = Σ_{t=2}^{N} [X_t - X_{t-1}]²,   0 ≤ D ≤ N - 1.                 (17)

Example: a series of length N = 10 with D = 5.
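
An R sketch (my illustration; the function name is hypothetical) of the clipping and the zero-crossing count D of (17):

zc.count <- function(z) {
  x <- as.numeric(z >= 0)          # clipped binary series X_t
  sum(diff(x)^2)                   # D = number of symbol changes, between 0 and N - 1
}
zc.count(rnorm(10))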

Cosine Formula

Let {Z_t} be a stationary Gaussian process. There is an explicit formula connecting ρ_1 and E[D]:

    ρ_1 = cos( πE[D] / (N - 1) )                                       (18)

An inverse relationship: as E(D) decreases to 0, ρ_1 increases to 1; as E(D) increases to N - 1, ρ_1 decreases to -1.
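
An R sketch (illustration; φ and N are assumed values) checking the cosine formula (18) empirically on a Gaussian AR(1), for which ρ_1 = φ. A single realization's D stands in for E[D], so the match is approximate.

set.seed(3)
phi <- 0.8; N <- 5000
z <- as.vector(arima.sim(model = list(ar = phi), n = N))
D <- sum(diff(as.numeric(z >= 0))^2)
round(c(cos.formula = cos(pi * D / (N - 1)), rho1 = phi), 3)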

Zero-Crossing Spectral Representation

    cos( πE[D] / (N - 1) ) = ∫_{-π}^{π} cos(ω) dF(ω) / ∫_{-π}^{π} dF(ω)              (19)

Assume F is continuous at the origin. Then

    cos( πE[D] / (N - 1) ) = ∫_{0}^{π} cos(ω) dF(ω) / ∫_{0}^{π} dF(ω)                (20)

Dominant Frequency Principle: For ω_0 ∈ (0, π), suppose

    F(ω+) - F(ω-) > 0 for ω = ω_0,   and = 0 for ω ≠ ω_0.

Then

    cos( πE[D] / (N - 1) ) = cos(ω_0),

or, by the monotonicity of cos(x) in [0, π],

    πE[D] / (N - 1) = ω_0.

Higher Order Crossings (HOC)

1. A stationary time series {Z_t}, t = 0, ±1, ±2, ...
2. A family of filters {L_θ(·), θ ∈ Θ}.
3. The filtered series L_θ(Z)_1, L_θ(Z)_2, ..., L_θ(Z)_N.
4. The clipped series
       X_t(θ) = 1 if L_θ(Z)_t ≥ 0,   and 0 if L_θ(Z)_t < 0.
5. The HOC {D_θ, θ ∈ Θ}:
       D_θ = Σ_{t=2}^{N} [X_t(θ) - X_{t-1}(θ)]².

HOC combines ZC counts and linear operations (filters).

Connection Between ZC and Filtering

Let H(ω; θ) be the transfer function corresponding to L_θ(·), and assume {Z_t} is a zero-mean stationary Gaussian process. Then

    ρ_1(θ) ≡ cos( πE[D_θ] / (N - 1) )
           = ∫_{-π}^{π} cos(ω)|H(ω; θ)|² dF(ω) / ∫_{-π}^{π} |H(ω; θ)|² dF(ω)         (21)

Assuming F is continuous at 0,

    ρ_1(θ) ≡ cos( πE[D_θ] / (N - 1) )
           = ∫_{0}^{π} cos(ω)|H(ω; θ)|² dF(ω) / ∫_{0}^{π} |H(ω; θ)|² dF(ω)           (22)

The representations (21) and (22) help us understand the effect of filtering on zero-crossings through the spectrum, even in the general non-Gaussian case. HOC now refers to both ρ_1(θ) and D_θ.

Note: ρ_1(θ) can be defined directly and in general; there is no need for the Gaussian assumption:

    ρ_1(θ) ≡ ∫_{-π}^{π} cos(ω)|H(ω; θ)|² dF(ω) / ∫_{-π}^{π} |H(ω; θ)|² dF(ω)         (23)

HOC From Differences

Define ∇Z_t ≡ Z_t - Z_{t-1} and

    L_j ≡ ∇^{j-1},   j ∈ {1, 2, 3, ...},

with L_1 ≡ ∇^0 being the identity filter. The corresponding HOC are called the simple HOC. Thus,

    D_1 from ZC of Z_t
    D_2 from ZC of (∇Z)_t
    D_3 from ZC of (∇²Z)_t
    D_4 from ZC of (∇³Z)_t
    ...

giving the simple HOC sequence D_1, D_2, D_3, ...
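
An R sketch (illustration; the function name simple.hoc is hypothetical) computing the simple HOC D_1, ..., D_K by repeated differencing and clipping, as described above.

simple.hoc <- function(z, K) {
  D <- numeric(K)
  for (j in 1:K) {
    x <- as.numeric(z >= 0)        # clip the current difference of the series
    D[j] <- sum(diff(x)^2)         # zero-crossing count D_j
    z <- diff(z)                   # next difference, used for D_{j+1}
  }
  D
}
simple.hoc(rnorm(1000), K = 6)     # the counts typically increase with j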

Properties of Simple HOC:

(a) Monotonicity: D_j - 1 ≤ D_{j+1}, which implies, under strict stationarity,

    0 ≤ E[D_1] ≤ E[D_2] ≤ E[D_3] ≤ ... ≤ N - 1.

Example: E[D_k] of Gaussian white noise, with N = 1000:

    E[D_k] = (N - 1) { 1/2 + (1/π) sin⁻¹( (k-1)/k ) }.

(Table of k versus E[D_k] omitted.)
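
An R one-liner (my illustration) evaluating the white-noise formula above for N = 1000 and the first few k; the expected counts increase toward N - 1.

N <- 1000; k <- 1:6
round((N - 1) * (0.5 + asin((k - 1) / k) / pi), 1)   # 499.5, 666.0, ... increasing toward N - 1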

Problem: Thus, {E[D_j]} is a monotone bounded sequence. What does it converge to as j → ∞?

Problem: What happens when E[D_1] = E[D_2]?

Problem: As j → ∞, what happens to {X_t(j)}?

Problem: As N → ∞, does D_1/N → constant?

(b) Kedem (1984): If {Z_t} is Gaussian and

    E[D_1]/(N - 1) = E[D_2]/(N - 1),

then, with probability 1,

    Z_t = A cos(ω_0 t + ϕ),   ω_0 = πE[D_1]/(N - 1).

(c) Suppose {Z_t} is Gaussian, and let ω* be the highest positive frequency in the spectral support, ω ∈ [0, ω*]. Then, regardless of spectrum type,

    πE[D_j]/(N - 1) → ω*,   j → ∞.                                     (24)

(d) Suppose {Z_t} is Gaussian. Does D_1/N → constant as N → ∞?

Kedem-Slud (1994):
    Yes, in the continuous spectrum case.
    Sometimes, if the spectrum contains one jump.
    No, if the spectrum contains two or more jumps.

(e) Kedem-Slud (1982): Higher Order Crossings Theorem

Let {Z_t}, t = 0, ±1, ..., be a zero-mean stationary process, and assume that π is included in the spectral support. Define

    X_t(j) = 1 if ∇^{j-1} Z_t ≥ 0,   and 0 if ∇^{j-1} Z_t < 0.

Then,

(i) as j → ∞,

    {X_t(j)} → ...101010...,  with probability 1/2,
               ...010101...,  with probability 1/2;

(ii) lim_{j→∞} lim_{N→∞} D_j/(N - 1) = 1, with probability 1.

Demonstration of the HOC Theorem using AR(1) with parameters φ = 0.0 and φ = 0.8.

(Table: j versus the clipped series X_t(j) and the counts D_j, for φ = 0.8 and for φ = 0.0 (white noise); values omitted.)

The limiting alternating state is approached quite fast, and D_1 < D_2 < D_3 < ... < D_16. Only the first few D_k's are useful in discrimination between processes.

(f) For a zero-mean stationary Gaussian process, the sequence of expected simple HOC {E[D_k]} determines the spectrum up to a constant:

    {E[D_k]}  ⟺  {ρ_k}  ⟺  F(ω).                                       (25)

(g) Kedem (1980): Let {Z_t} be a zero-mean stationary Gaussian process with acf ρ_j. If

    Σ_{j=-∞}^{∞} |ρ_j| < ∞   and   Σ_{j=-∞}^{∞} |κ_x(1, j, 1-j)| < ∞,

then

    ( D_1 - E[D_1] ) / √N  →_L  N(0, σ_1²),   N → ∞,

where

    σ_1² = (1/π²) Σ_{j=-∞}^{∞} { (sin⁻¹ ρ_j)² + sin⁻¹ ρ_{j-1} · sin⁻¹ ρ_{j+1} + 4π² κ_x(1, j, 1-j) }.

(h) Slud (1991): Let {Z_t}, t = 0, ±1, ..., be a zero-mean stationary Gaussian process with acf ρ_j, Var[Z_0] = 1, and square-integrable spectral density f. Then

    ( D_1 - E[D_1] ) / √N  →_L  N(0, σ_1²),   N → ∞,

where σ_1² satisfies

    σ_1² ≥ ( 1 / (π(1 - ρ_1²)) ) ∫_{-π}^{π} ( ρ_1 - cos(ω) )² f²(ω) dω > 0.

Proof: Use Itô-Wiener calculus.

(i) Problem: Given two stationary processes with the same spectrum, of which one is Gaussian and the other is not, is it true that the Gaussian process has the higher expected ZCR?

Answer: Assume continuous time. Let {Z(t)} be a stationary Gaussian process with Z(t) ~ N(0, 1), -∞ < t < ∞. Consider the interval [0, 1].

Divide [0, 1] into N - 1 intervals of size Δ. Define the sampled time series

    Z_k ≡ Z((k - 1)Δ),   k = 1, 2, ..., N.

Then

    ρ_1 = ρ(Δ).                                                        (26)

From the cosine formula we obtain Rice's (1944) formula:

    E[D_c] ≡ lim_{Δ→0} E[D_1] = lim_{Δ→0} (N - 1)/π · cos⁻¹(ρ(Δ)) = (1/π) √(-ρ''(0)).   (27)

Barnett-Kedem (1998): There are non-Gaussian processes such that

    E[D_c] = (κ/π) √(-ρ''(0))

with κ < 1 and with κ > 1. If Z_1(t), Z_2(t) are independent copies of Z(t), then the product Z_1(t)Z_2(t) has κ = √2. For Z³(t), κ = √(5/9).

Application: Discrimination by Simple HOC

The ψ² Statistic: When N is sufficiently large (e.g. N ≥ 200), then with high probability

    0 < D_1 < D_2 < D_3 < ... < (N - 1).

To capture the rate of increase in the first few D_k, consider the increments

    Δ_k = D_1,                 if k = 1
        = D_k - D_{k-1},       if k = 2, ..., K - 1
        = (N - 1) - D_{K-1},   if k = K.

Then Σ_{k=1}^{K} Δ_k = N - 1. Let m_k = E[Δ_k]. We define a general similarity measure

    ψ² ≡ Σ_{k=1}^{K} ( Δ_k - m_k )² / m_k                              (28)

When the Δ_k and m_k are from the same process, and K = 9: P(ψ² > 30) < 0.05.
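
An R sketch (my illustration; the helper names are hypothetical) of the ψ² measure (28), built from the increments of observed counts D_1, ..., D_{K-1} and reference means m_1, ..., m_K, e.g. with D from the simple.hoc sketch earlier and m estimated from a reference series of the same length.

psi2 <- function(D, m, N) {
  K <- length(m)
  delta <- c(D[1], diff(D[1:(K - 1)]), (N - 1) - D[K - 1])   # increments Delta_1, ..., Delta_K
  sum((delta - m)^2 / m)                                     # the psi^2 statistic (28)
}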

Application: Frequency Estimation

Recall the AR(1) filter (α-filter),

    Z_t(α) = L_α(Z)_t = Z_t + αZ_{t-1} + α²Z_{t-2} + ...,

with squared gain

    |H(ω; α)|² = 1 / (1 - 2α cos(ω) + α²),   α ∈ (-1, 1),  ω ∈ [0, π].

Consider:

    Z_t = A_1 cos(ω_1 t) + B_1 sin(ω_1 t) + ζ_t,   t = 0, ±1, ...,   ω_1 ∈ (0, π),

where A_1, B_1 are uncorrelated N(0, σ_1²), and {ζ_t} is N(0, σ_ζ²) white noise independent of A_1, B_1.

Define:

    C(α) = Var(ζ_t(α)) / Var(Z_t(α)).                                  (29)

Then for α ∈ (-1, 1), 0 < C(α) < 1.

He-Kedem (1989) Algorithm

Let {D_α} be the HOC from the AR(1) filter. Fix α_1 ∈ (-1, 1). Define

    α_{k+1} = cos( πE[D_{α_k}] / (N - 1) ),   k = 1, 2, ...            (30)

Then, as k → ∞,

    α_k → cos(ω_1)   and   πE[D_{α_k}] / (N - 1) → ω_1.                (31)

Proof: Note the fundamental property of the AR(1) filter gain:

    α = ρ_{1,ζ}(α) = ∫_{-π}^{π} cos(ω)|H(ω; α)|² dω / ∫_{-π}^{π} |H(ω; α)|² dω.      (32)
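
An R sketch of a sample version of the iteration (30), replacing E[D_α] by the observed zero-crossing count of the AR(1)-filtered series. This is my illustration (the function name is hypothetical), not the authors' implementation; their S-Plus code, which uses the AR(2) filter, appears later in these notes.

hk.iterate <- function(z, alpha0 = 0, niter = 10) {
  N <- length(z)
  alpha <- alpha0
  for (k in 1:niter) {
    y <- filter(z, alpha, method = "recursive")   # AR(1) filter: Y_t = alpha*Y_{t-1} + Z_t
    D <- sum(diff(as.numeric(y >= 0))^2)          # observed zero-crossing count D_alpha
    alpha <- cos(pi * D / (N - 1))                # update (30)
  }
  acos(alpha)                                     # estimate of omega_1, by (31)
}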

We have

    ρ_1(α) = cos( πE[D_α] / (N - 1) )
           = [ σ_1²|H(ω_1; α)|² cos(ω_1) + α ∫_{-π}^{π} |H(ω; α)|² dF_ζ(ω) ]
             / [ σ_1²|H(ω_1; α)|² + ∫_{-π}^{π} |H(ω; α)|² dF_ζ(ω) ],

a weighted average of cos(ω_1) and α (!)

Thus, we have a contraction mapping

    ρ_1(α) = α* + C(α)(α - α*)                                         (33)

and the recursion (30) becomes, with fixed point α*,

    α_{k+1} = ρ_1(α_k).                                                (34)

At the fixed point,

    α* = ρ_1(α*)   or   cos(ω_1) = cos( πE[D_{α*}] / (N - 1) ).

By the monotonicity of cos(x), x ∈ [0, π],

    ω_1 = πE[D_{α*}] / (N - 1).

Extension: Contractions From Bandpass Filters (CM)

Yakowitz (1991): We do not need the Gaussian assumption, and we can speed up the contraction convergence.

1. Take a parametric family of band-pass filters indexed by r ∈ (-1, 1) and by a bandwidth parameter M:

       {L_{r,M}(·), r ∈ (-1, 1), M = 1, 2, ...}.

2. Let h(n; r, M) and H(ω; r, M) be the corresponding complex impulse response and transfer function, respectively.

3. It is required that as M → ∞, |H(ω; r, M)|² converges to a Dirac delta function centered at θ(r) ≡ cos⁻¹(r).

4. Assume further that the filter passes only the (positive) discrete frequency to be detected; suppose it is ω_1. Also, observe that

       R{ E[ζ_t(r, M) ζ̄_{t-1}(r, M)] / E|ζ_t(r, M)|² }
           = ∫_{-π}^{π} cos(ω)|H(ω; r, M)|² dF_ζ(ω) / ∫_{-π}^{π} |H(ω; r, M)|² dF_ζ(ω),

   where the overbar denotes complex conjugate and R{·} denotes the real part.

5. Suppose that for any M the fundamental property takes the form

       r = R{ E[ζ_t(r, M) ζ̄_{t-1}(r, M)] / E|ζ_t(r, M)|² }.            (35)

6. Define

       ρ_1(r, M) ≡ R{ E[Z_t(r, M) Z̄_{t-1}(r, M)] / E|Z_t(r, M)|² }.    (36)

7. Clearly,

       ρ_1(r, M) = [ (1/2)σ_1²|H(ω_1; r, M)|² cos(ω_1) + r ∫_{-π}^{π} |H(ω; r, M)|² dF_ζ(ω) ]
                   / [ (1/2)σ_1²|H(ω_1; r, M)|² + ∫_{-π}^{π} |H(ω; r, M)|² dF_ζ(ω) ].

8. Let

       C(r, M) = E|ζ_t(r, M)|² / E|Z_t(r, M)|².

9. The contraction now has an extra parameter M:

       ρ_1(r, M) = r* + C(r, M)(r - r*),                               (37)

   where r* = cos(ω_1), and ω_1 is the true frequency.

10. The CM algorithm now takes the form

       r_{k+1} = ρ_1(r_k, M_k).                                        (38)

CM With the Parametric AR(2) Filter in Practice

With α ∈ (-1, 1) and η ∈ (0, 1),

    Y_t(α) = (1 + η²)αY_{t-1}(α) - η²Y_{t-2}(α) + Y_t.                 (39)

Initial guess of ω_1: θ_0. Start with α_0 = cos(θ_0) and with η close to 1, increasing η by a small increment at each iteration.

Define the sample autocorrelation

    ρ̂_1(α, η) = Σ_{t=1}^{N-1} Y_t(α)Y_{t-1}(α) / Σ_{t=0}^{N-1} Y_t²(α).   (40)

Then the CM algorithm is given by

    α_{k+1} = ρ̂_1(α_k, η_k),   k = 0, 1, 2, ...                        (41)

where η_k increases with each iteration.

Li (1992), Song-Li (2000), Li-Song (2002):

    Y_t = β cos(ω_1 t + φ) + ε_t                                       (42)

where β is a positive constant, ω_1 ∈ (-π, π), φ ~ Unif(0, π], and {ε_t} is i.i.d. with mean 0 and variance σ_ε², independent of φ.

If (1 - η)²N → 0 as N → ∞, then

    (1 - η)^{-1/2} N ( ω̂_1 - ω_1 )  →_L  N(0, γ⁻¹),   γ = (1/2)β²/σ_ε².

The implication of this is that, by a judicious choice of η, the precision of the CM estimate can be made arbitrarily close to that achieved by periodogram maximization and nonlinear least squares. More properties and discussions can be found in Li-Kedem (1993a, 1993b, 1994, 1998), Kedem (1994), Li-Song (2002).

S-Plus Code for the CM Algorithm

KY.AR2 <- function(z, theta0, eta, inc, niter) {
  y <- rep(0, length(z))
  r <- rep(0, niter); OMEGA <- rep(0, niter)
  r[1] <- cos(theta0); OMEGA[1] <- theta0
  cat(c("initial frequency guess is", OMEGA[1]), fill = T)
  cat(c("eta", "  r(k)", "  Omega(k)", "  Var(y)"), fill = T)
  for (k in 2:niter) {                     # eta increments by inc
    eta <- eta + inc
    if ((eta < 0) || (eta > 1)) stop("eta must be between 0 and 1")
    FiltCoeff <- c((1 + eta^2) * r[k - 1], -(eta^2))
    y <- filter(z, FiltCoeff, "rec")       # CM iterations
    rrr <- acf(y)                          # motif() must be on
    r[k] <- rrr$acf[2]                     # gives acf(1)!!!
    OMEGA[k] <- acos(r[k])
    cat(c(eta, r[k], OMEGA[k], var(y)), fill = T)
  }
}
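
A hypothetical call (my own parameter choices; the starting η, increment, and iteration count used on the slides were not preserved), on data generated like the two-sinusoid example on the next slide, with arbitrary fixed phases:

t <- 1:1500
z <- 0.5 * cos(0.513 * t + 1.0) + cos(0.771 * t + 2.0) + 2.2 * rnorm(1500)
KY.AR2(z, theta0 = 0.48, eta = 0.80, inc = 0.01, niter = 15)   # should move toward omega = 0.513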

Example:

    Y_t = 0.5 cos(0.513t + φ_1) + cos(0.771t + φ_2) + 2.2ε_t,   t = 1, ..., 1500,   ε_t i.i.d. N(0, 1).

    SNR = 10 log₁₀( (0.5²/2 + 1²/2) / 2.2² ).

Starting at θ_0 = 0.48, with η close to 1 and a small increment of η at each iteration.

(Table: η, α(k), ω(k), Var(Y_t(α)) per iteration, together with the final estimate ω̂ and its error; values omitted.)

We are going to apply CM to the data without centering.

Starting at θ_0 = 0.88, with η close to 1 and a small increment of η at each iteration.

(Table: η, α(k), ω(k), Var(Y_t(α)) per iteration, together with the final estimate ω̂ and its error; values omitted.)

(Table: comparison of the CM and FFT estimates ω̂ against the true ω, with the error of each; values omitted.)

Example: Detection of a Diurnal Cycle in GATE I.

The CM algorithm with the AR(2) parametric filter was applied to a time series of length N = 450 of hourly rain rate from GATE I (early 1970s), averaged over a region of ... km².

(Figure: GATE I hourly averaged rain rate versus TIME.)

Starting at 0.29, to the right of 2π/24: η close to 1, increased by a small increment at each iteration.

(Table: η, α(k), ω(k), Var(Y_t(α)) per iteration; values omitted.)

Starting at 0.25, to the left of 2π/24: η close to 1, increased by a small increment at each iteration.

(Table: η, α(k), ω(k), Var(Y_t(α)) per iteration; values omitted.)

2π/24 = 0.2618.

References

Barnett and Kedem (1998). Zero-crossing rates of mixtures and products of Gaussian processes. IEEE Tr. Information Theory, 44.

Hadjileontiadis, L.J. (2003). Discrimination analysis of discontinuous breath sounds using higher-order crossings. Med. Biol. Eng. Comput., 41.

He and Kedem (1989). Higher Order Crossings Spectral Analysis of an Almost Periodic Random Sequence in Noise. IEEE Tr. Information Theory, 35.

Kedem (1976). Exact Maximum Likelihood Estimation of the Parameter in the AR(1) Process After Hard Limiting. IEEE Tr. Information Theory, IT-22, No. 4.

Kedem (1984). On the Sinusoidal Limit of Stationary Time Series. Annals of Statistics, 12.

Kedem (1986). Spectral Analysis and Discrimination by Zero-Crossings. Proceedings of the IEEE, 74.

Kedem (1987). Detection of hidden periodicities by means of higher order crossings. Jour. of Time Series Analysis, 8.

Kedem (1987). Higher Order Crossings in Time Series Model Identification. Technometrics, 19.

Kedem (1992). Contraction mappings in mixed spectrum estimation. In New Directions in Time Series Analysis, D. Brillinger et al., eds., Springer-Verlag, NY, IMA Vol. 45.

Kedem (1994). Time Series Analysis by Higher Order Crossings. IEEE Press, NJ.

Kedem and Li (1991). Monotone gain, first-order autocorrelation, and expected zero-crossing rate. Annals of Statistics, 19.

Kedem and Lopez (1992). Fixed points in mixed spectrum analysis. In Probabilistic and Stochastic Methods in Analysis, With Applications, J. S. Byrnes et al., eds., Kluwer, Mass.

Kedem and Slud (1981). On Goodness of Fit of Time Series Models: An Application of Higher Order Crossings. Biometrika, 68.

Kedem and Slud (1982). Time Series Discrimination by Higher Order Crossings. Annals of Statistics, 10.

Kedem and Slud (1994). On autocorrelation estimation in mixed spectrum Gaussian processes. Stochastic Processes and Their Applications, 49.

Kedem and Troendle (1994). An iterative filtering algorithm for non-Fourier frequency estimation. Jour. of Time Series Analysis, 15.

Kedem and Yakowitz (1994). Practical aspects of a fast algorithm for frequency detection. IEEE Tr. Communications, 42.

Kong and Qiu (2000). Injury detection for central nervous system via EEG with higher order crossing-based methods. Method. Inform. Med., 39.

Li (1997). Time series characterization, Poisson integral, and robust divergence measures. Technometrics, 39.

Li (1998). Time-correlation analysis of nonstationary time series. Journal of Time Series Analysis, 19.

Li and Kedem (1993). Strong consistency of the contraction mapping method for frequency estimation. IEEE Tr. Information Theory, 39.

Li and Kedem (1993). Asymptotic analysis of a multiple frequency estimation method. Jour. Multivariate Analysis, 46.

Li and Kedem (1994). Iterative filtering for multiple frequency estimation. IEEE Tr. Signal Processing, 42.

Li and Kedem (1998). Tracking abrupt frequency changes. Jour. of Time Series Analysis, 19.

Li and Song (2002). Asymptotic analysis of a contraction mapping algorithm for multiple frequency estimation. IEEE Tr. Information Theory, 48, No. 10.

Song and Li (2000). A statistically and computationally efficient method for frequency estimation. Stochastic Processes and Their Applications, 86.

Yakowitz (1991). Some contributions to a frequency location method due to He and Kedem. IEEE Tr. Information Theory, 37.
