# Analysis and Computation for Finance Time Series - An Introduction


## 1. ECMM703 Analysis and Computation for Finance: Time Series, An Introduction

Alejandra González Harrison

## 2. Time Series: An Introduction

A time series is a sequence of observations ordered in time; the observations are numbers (e.g. measurements). Time series analysis comprises methods that attempt to:

- understand the underlying context of the data (where did they come from? what generated them?);
- make forecasts (predictions).

## 3. Definitions and Setting

A stochastic process is a collection of random variables $\{Y_t : t \in T\}$ defined on a probability space $(\Omega, \mathcal{F}, P)$. In time series modelling, a sequence of observations is considered as one realisation of an unknown stochastic process:

1. Can we infer properties of this process?
2. Can we predict its future behaviour?

By "time series" we shall mean both the sequence of observations and the process of which it is a realisation (an abuse of language). We will only consider discrete time series: observations $(y_1, \ldots, y_N)$ of a variable at different times ($y_i = y(t_i)$, say).

## 4. Setting (cont.)

We will only deal with time series observed at regular time points (days, months, etc.). We focus on pure univariate time series models: a single time series $(y_1, \ldots, y_N)$ is modelled in terms of its own values and their order in time; no external factors are considered. Modelling of time series which:

- are measured at irregular time points, or
- are made up of several observations at each time point (multivariate data), or
- involve explanatory variables $x_t$ measured at each time point,

is based upon the ideas presented here.

## 5. Work plan

We provide an overview of pure univariate time series models:

- ARMA ("Box-Jenkins") models;
- ARIMA models;
- GARCH models.

Models will be implemented in R, a public-domain general-purpose statistical language.

## 6. References

1. Anderson, O. D. *Time Series Analysis and Forecasting: The Box-Jenkins Approach*. Butterworths, London and Boston, Mass.
2. Box, George E. P. and Jenkins, Gwilym M. *Time Series Analysis: Forecasting and Control*. Holden-Day, San Francisco, Calif.
3. Brockwell, Peter J. and Davis, Richard A. *Time Series: Theory and Methods*, Second Edition. Springer Series in Statistics, Springer-Verlag.
4. Cryer, Jonathan D. *Time Series Analysis*. PWS-KENT Publishing Company, Boston.
5. The R webpage.
6. *Time Series Analysis and Its Applications: With R Examples*.

## 7. Statistical versus time series modelling

Problem: given a time series $(y_1, y_2, \ldots, y_N)$: (i) determine its temporal structure and patterns; (ii) forecast non-observed values.

Approach: construct a mathematical model for the data.

In statistical modelling it is typically assumed that the observations $(y_1, \ldots, y_N)$ are a sample from a sequence of independent random variables. Then there is no covariance (or correlation) structure between the observations; in other words, the joint probability distribution of the data is just the product of the univariate probability distributions of the individual observations. We are mostly concerned with estimation of the mean behaviour $\mu_i$ and the variance $\sigma_i^2$ of the error about the mean, the errors being unrelated to each other.

## 8. Statistical vs. time series modelling (cont.)

However, for a time series we cannot assume that the observations $(y_1, y_2, \ldots, y_N)$ are independent: the data will be serially correlated (auto-correlated) rather than independent. Since we want to understand and predict the data, we need to explain and exploit the correlation structure between observations. Hence we need stochastic processes with a correlation structure over time in their random component, and we must consider directly the joint multivariate distribution of the data, $p(y_1, \ldots, y_N)$, rather than just each marginal distribution $p(y_t)$.

## 9. Time series modelling

If one could assume joint normality of $(y_1, \ldots, y_N)$, the joint distribution $p(y_1, \ldots, y_N)$ would be completely characterised by:

- the means $\mu = (\mu_1, \mu_2, \ldots, \mu_N)$;
- the auto-covariance matrix $\Sigma$, i.e. the $N \times N$ matrix with entries $\sigma_{ij} = \mathrm{cov}(y_i, y_j) = E[(y_i - \mu_i)(y_j - \mu_j)]$.

In practice joint normality is not an appropriate assumption for most time series (certainly not for most financial time series). Nevertheless, in many cases knowledge of $\mu$ and $\Sigma$ is sufficient to capture the major properties of the time series.

## 10. Time series modelling (cont.)

Thus the focus in time series analysis reduces to understanding the mean $\mu$ and the autocovariance $\Sigma$ of the generating process (weakly stationary time series). In applications both $\mu$ and $\Sigma$ are unknown and must be estimated from the data. There are $N$ elements in the mean component $\mu$ and $N(N+1)/2$ distinct elements in $\Sigma$: vastly too many distinct unknowns to estimate without further restrictions. To reduce the number of unknowns, we have to introduce parametric structure so that the modelling becomes manageable.

## 11. Strict stationarity

The time series $\{Y_t : t \in \mathbb{Z}\}$ is strictly stationary if the joint distributions of $(Y_{t_1}, \ldots, Y_{t_k})$ and $(Y_{t_1+\tau}, \ldots, Y_{t_k+\tau})$ are the same for all positive integers $k$ and all $t_1, \ldots, t_k, \tau \in \mathbb{Z}$. Equivalently, the time series is strictly stationary if the random vectors $(Y_1, \ldots, Y_k)$ and $(Y_{1+\tau}, Y_{2+\tau}, \ldots, Y_{k+\tau})$ have the same joint probability distribution for any time shift $\tau$.

- Taking $k = 1$ yields that $Y_t$ has the same distribution for all $t$. If $E[|Y_t|^2] < \infty$, then $E[Y_t]$ and $\mathrm{Var}(Y_t)$ are both constant.
- Taking $k = 2$, we find that $(Y_t, Y_{t+h})$ has the same joint distribution for all $t$, and hence $\mathrm{cov}(Y_t, Y_{t+h})$ depends only on $h$.

## 12. Weak stationarity

Let $\{Y_t : t \in \mathbb{Z}\}$ be a stochastic process with mean $\mu_t$ and variance $\sigma_t^2 < \infty$ for each $t$. The autocovariance function is defined by

$$\gamma(t, s) = \mathrm{cov}(Y_t, Y_s) = E[(Y_t - \mu_t)(Y_s - \mu_s)].$$

The stochastic process $\{Y_t : t \in \mathbb{Z}\}$ is weakly stationary if for all $t \in \mathbb{Z}$:

- $E[|Y_t|^2] < \infty$ and $E[Y_t] = m$ (a constant);
- $\gamma(r, s) = \gamma(r + t, s + t)$ for all $r, s \in \mathbb{Z}$.

Notice that the autocovariance function of a weakly stationary process is a function of the time shift (or lag) $\tau \in \mathbb{Z}$ only:

$$\gamma_\tau = \gamma(\tau, 0) = \mathrm{cov}(Y_{t+\tau}, Y_t) \quad \text{for all } t \in \mathbb{Z}.$$

In particular the variance is independent of time: $\mathrm{Var}(Y_t) = \gamma_0$.

## 13. Autocorrelation

Let $\{Y_t : t \in \mathbb{Z}\}$ be a stochastic process with mean $\mu_t$ and variance $\sigma_t^2 < \infty$ for each $t$. The autocorrelation is defined by

$$\rho(t, s) = \frac{\mathrm{cov}(Y_t, Y_s)}{\sigma_t \sigma_s} = \frac{\gamma(t, s)}{\sigma_t \sigma_s}.$$

If the function $\rho(t, s)$ is well defined, its value lies in the range $[-1, 1]$, with $1$ indicating perfect correlation and $-1$ indicating perfect anti-correlation. The autocorrelation describes the correlation between the process at different points in time.

## 14. Autocorrelation Function (ACF)

If $\{Y_t : t \in \mathbb{Z}\}$ is weakly stationary then the autocorrelation depends only on the lag $\tau \in \mathbb{Z}$:

$$\rho_\tau = \frac{\mathrm{cov}(Y_{t+\tau}, Y_t)}{\sigma^2} = \frac{\gamma_\tau}{\sigma^2} \quad \text{for all } t \in \mathbb{Z},$$

where $\sigma^2 = \gamma_0$ denotes the variance of the process. So weak stationarity (and therefore also strict stationarity) implies that the auto-correlations depend only on the lag $\tau$; this relationship is referred to as the auto-correlation function (ACF) of the process.

## 15. Partial Autocorrelation Function (PACF)

For a weakly stationary process $\{Y_t : t \in \mathbb{Z}\}$, the PACF $\alpha_k$ at lag $k$ may be regarded as the correlation between $Y_1$ and $Y_{1+k}$, adjusted for the intervening observations $Y_2, Y_3, \ldots, Y_k$. For $k \geq 2$ the PACF is the correlation of the two residuals obtained after regressing $Y_{1+k}$ and $Y_1$ on the intermediate observations $Y_2, Y_3, \ldots, Y_k$. The PACF at lag $k$ is defined by $\alpha_k = \psi_{kk}$, $k \geq 1$, where $\psi_{kk}$ is uniquely determined by

$$\begin{pmatrix} \rho_0 & \rho_1 & \rho_2 & \cdots & \rho_{k-1} \\ \rho_1 & \rho_0 & \rho_1 & \cdots & \rho_{k-2} \\ \vdots & & & & \vdots \\ \rho_{k-1} & \rho_{k-2} & \rho_{k-3} & \cdots & \rho_0 \end{pmatrix} \begin{pmatrix} \psi_{k1} \\ \psi_{k2} \\ \vdots \\ \psi_{kk} \end{pmatrix} = \begin{pmatrix} \rho_1 \\ \rho_2 \\ \vdots \\ \rho_k \end{pmatrix}.$$
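The course implements its models in R; purely as an illustrative aside (a hypothetical sketch, not part of the original notes), the Toeplitz system above can be solved directly in plain Python to obtain the PACF from a given ACF:

```python
# Hypothetical sketch: compute the PACF alpha_k = psi_kk by solving the
# k x k Toeplitz system R psi = rho shown above, in pure Python.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def pacf(rho, k):
    """PACF at lag k: last component of the solution of the Toeplitz system
    with entries rho_|i-j| and right-hand side (rho_1, ..., rho_k)."""
    A = [[rho[abs(i - j)] for j in range(k)] for i in range(k)]
    b = [rho[j + 1] for j in range(k)]
    return solve(A, b)[-1]

# For an AR(1) process with phi = 0.6 the theoretical ACF is rho_j = phi**j,
# so the PACF is phi at lag 1 and zero at every higher lag.
phi = 0.6
rho = [phi ** j for j in range(11)]
print(pacf(rho, 1))  # 0.6
print(pacf(rho, 2))  # 0 up to rounding
```

This cut-off of the PACF after the AR order is exactly the identification device used later in the notes.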

## 16. Stationary models

Assuming weak stationarity, modelling a time series reduces to estimation of a constant mean $\mu = \mu_t$ and of a covariance matrix

$$\Sigma = \sigma^2 \begin{pmatrix} 1 & \rho_1 & \rho_2 & \cdots & \rho_{N-1} \\ \rho_1 & 1 & \rho_1 & \cdots & \rho_{N-2} \\ \rho_2 & \rho_1 & 1 & \cdots & \rho_{N-3} \\ \vdots & & & \ddots & \vdots \\ \rho_{N-1} & \rho_{N-2} & \rho_{N-3} & \cdots & 1 \end{pmatrix}.$$

There are many fewer parameters in $\Sigma$ (the $N-1$ correlations and $\sigma^2$) than in an arbitrary, unrestricted covariance matrix. Still, for large $N$ the estimation can be problematic without additional structure in $\Sigma$ to further reduce the number of parameters.

## 17. Auto-regressive Moving Average (ARMA) processes

Weakly stationary Auto-regressive Moving Average (ARMA) processes allow reduction to a manageable number of parameters. The simple structure of ARMA processes makes them very useful and flexible models for weakly stationary time series $(y_1, \ldots, y_N)$. We assume that $y_t$ has zero mean; incorporation of a non-zero mean is straightforward. Modelling of non-stationary data is based on variations of ARMA models.

## 18. ARMA modelling: first-order auto-regressive processes, AR(1)

The simplest example from the ARMA family is the first-order auto-regressive process, denoted AR(1):

$$y_t = \phi_1 y_{t-1} + \epsilon_t. \tag{1}$$

Here the $\epsilon_t$ constitute a white noise process, i.e. zero-mean random shocks or innovations assumed to be independent of each other and identically distributed with constant variance $\sigma_\epsilon^2$. Equation (1) can be written symbolically in the more compact form $\phi(B) y_t = \epsilon_t$, where $\phi(z) = 1 - \phi_1 z$ and $B$ is the backward-shift or lag operator defined by $B^m y_t = y_{t-m}$.

## 19. AR(1) (cont.)

The stationarity condition for an AR(1) process $y_t = \phi_1 y_{t-1} + \epsilon_t$ amounts to $|\phi_1| < 1$; equivalently, $\phi(z) = 1 - \phi_1 z \neq 0$ for all $z \in \mathbb{C}$ such that $|z| \leq 1$. By slight rearrangement and using the lag operator, the AR(1) model $(1 - \phi_1 B) y_t = \epsilon_t$ can be written as

$$y_t = (1 - \phi_1 B)^{-1} \epsilon_t = (1 + \phi_1 B + \phi_1^2 B^2 + \phi_1^3 B^3 + \cdots)\, \epsilon_t.$$

Notice that this series representation converges as long as $|\phi_1| < 1$.

## 20. AR(1) (cont.)

For the AR(1) process it can be shown that:

$$\mathrm{Var}(y_t) = \gamma_0 = \sigma_\epsilon^2 (1 + \phi_1^2 + \phi_1^4 + \cdots) = \frac{\sigma_\epsilon^2}{1 - \phi_1^2},$$
$$\mathrm{cov}(y_t, y_{t-k}) = \gamma_k = \gamma_{k-1} \phi_1, \quad k > 0,$$
$$\rho_k = \frac{\gamma_k}{\gamma_0} = \phi_1^k.$$

Since $|\phi_1| < 1$, the ACF $\rho_k$ shows a pattern which is decreasing in absolute value. This implies that the linear dependence of two observations $y_t$ and $y_s$ becomes weaker with increasing time distance between $t$ and $s$. If $0 < \phi_1 < 1$, the ACF decays exponentially to zero, while if $-1 < \phi_1 < 0$, the ACF decays in an oscillatory manner. Both decays are slow if $\phi_1$ is close to the non-stationary boundaries $\pm 1$.
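As an aside (a hypothetical sketch in plain Python rather than the course's R), the relation $\rho_k = \phi_1^k$ can be checked by simulation:

```python
# Hypothetical sketch: simulate an AR(1) process and compare the sample ACF
# with the theoretical values rho_k = phi**k.
import random

random.seed(42)
phi, n = 0.6, 50_000
y = [0.0]
for _ in range(n + 500):                       # extra draws for burn-in
    y.append(phi * y[-1] + random.gauss(0, 1))
y = y[501:]                                    # drop the burn-in segment

mean = sum(y) / n

def sample_acf(y, k):
    """Sample autocorrelation at lag k (1/N divisor, as defined later)."""
    g0 = sum((v - mean) ** 2 for v in y) / len(y)
    gk = sum((y[t] - mean) * (y[t + k] - mean) for t in range(len(y) - k)) / len(y)
    return gk / g0

for k in (1, 2, 3):
    print(k, round(sample_acf(y, k), 3), round(phi ** k, 3))
```

With 50 000 observations the sample and theoretical values agree to roughly two decimal places.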

## 21-23. ACF of AR(1) processes

[Figures: sample ACF plots against lag for AR(1) processes with different values of $\phi_1$.]

## 24. First-order moving average processes: MA(1)

A first-order moving-average process, MA(1), is defined by

$$y_t = \epsilon_t - \theta_1 \epsilon_{t-1} = (1 - \theta_1 B)\, \epsilon_t.$$

For the MA(1) process it can be shown that:

$$\mathrm{Var}(y_t) = \gamma_0 = (1 + \theta_1^2)\, \sigma_\epsilon^2,$$
$$\mathrm{cov}(y_t, y_{t-1}) = \gamma_1 = -\theta_1 \sigma_\epsilon^2,$$
$$\mathrm{cov}(y_t, y_{t-k}) = 0, \quad k > 1,$$
$$\rho_1 = \frac{\gamma_1}{\gamma_0} = \frac{-\theta_1}{1 + \theta_1^2}, \qquad \rho_k = 0, \quad k > 1.$$

Note: two observations $y_t$ and $y_s$ generated by an MA(1) process are uncorrelated if $t$ and $s$ are more than one observation apart.
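The cut-off of the MA(1) autocorrelations after lag 1 can likewise be checked by simulation (a hypothetical Python sketch; the course itself uses R):

```python
# Hypothetical sketch: simulate an MA(1) process y_t = e_t - theta*e_{t-1} and
# check that the sample lag-1 autocorrelation is near -theta/(1 + theta^2)
# while higher lags are near zero.
import random

random.seed(1)
theta, n = 0.8, 50_000
e = [random.gauss(0, 1) for _ in range(n + 1)]
y = [e[t] - theta * e[t - 1] for t in range(1, n + 1)]

mean = sum(y) / n

def acf(k):
    g0 = sum((v - mean) ** 2 for v in y) / n
    gk = sum((y[t] - mean) * (y[t + k] - mean) for t in range(n - k)) / n
    return gk / g0

print(acf(1), -theta / (1 + theta ** 2))  # theoretical value is about -0.4878
print(acf(2))                             # theoretical value is 0
```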

## 25-26. ACF of MA(1) processes

[Figures: sample ACF plots against lag for MA(1) processes with different values of $\theta_1$.]

## 27. AR(p) and MA(q) processes

Both the AR(1) and MA(1) processes impose strong restrictions on the pattern of the corresponding ACF. More general ACF patterns are allowed by autoregressive or moving-average models of higher order. The AR(p) and MA(q) models are defined as follows:

$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \epsilon_t \quad \text{(AR(p) process)}$$

and

$$y_t = \epsilon_t - \theta_1 \epsilon_{t-1} - \theta_2 \epsilon_{t-2} - \cdots - \theta_q \epsilon_{t-q} \quad \text{(MA(q) process).}$$

The $\phi_i$ and $\theta_j$, $i = 1, \ldots, p$; $j = 1, \ldots, q$, are parameters.

## 28. Autoregressive moving average processes: ARMA(p,q)

Combining the AR(p) and MA(q) processes we define the autoregressive moving average process ARMA(p,q):

$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \epsilon_t - \theta_1 \epsilon_{t-1} - \theta_2 \epsilon_{t-2} - \cdots - \theta_q \epsilon_{t-q}.$$

Using the lag operator $B$, the ARMA(p,q) model may be written

$$(1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p)\, y_t = (1 - \theta_1 B - \theta_2 B^2 - \cdots - \theta_q B^q)\, \epsilon_t,$$

or more compactly as $\phi(B) y_t = \theta(B) \epsilon_t$, where

$$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p, \qquad \theta(z) = 1 - \theta_1 z - \theta_2 z^2 - \cdots - \theta_q z^q.$$

## 29. A simulated ARMA(1,1) series

[Figure: a simulated ARMA(1,1) series ($\phi_1 = +0.5$) against time, together with its true ACF against lag.]

## 30. Stationarity conditions

Assume that the polynomials $\theta(z)$ and $\phi(z)$ have no common zeroes. An ARMA(p,q) model defined by $\phi(B) y_t = \theta(B) \epsilon_t$ is stationary if

$$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p \neq 0 \quad \text{for } |z| \leq 1.$$

For a stationary ARMA(p,q) process the polynomial $\phi(B)$ can be inverted, and so $y_t$ has a moving-average representation of infinite order:

$$y_t = \sum_{j=0}^{\infty} \psi_j \epsilon_{t-j}, \tag{2}$$

where the coefficients $\psi_j$ are determined by the relation

$$\frac{\theta(z)}{\phi(z)} = \sum_{j=0}^{\infty} \psi_j z^j, \quad |z| \leq 1.$$

We write equation (2) in the compact form $y_t = \phi^{-1}(B)\, \theta(B)\, \epsilon_t$.
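The coefficients $\psi_j$ can be obtained by matching powers of $z$ in $\phi(z) \sum_j \psi_j z^j = \theta(z)$, which gives a simple recursion. A hypothetical Python sketch (not from the notes, which use R):

```python
# Hypothetical sketch: compute the MA(infinity) weights psi_j of a stationary
# ARMA(p,q) model by long division of theta(z) by phi(z), i.e. by matching
# coefficients in phi(z) * sum_j psi_j z^j = theta(z).

def psi_weights(phi, theta, n):
    """phi = [phi_1..phi_p], theta = [theta_1..theta_q], with the conventions
    phi(z) = 1 - phi_1 z - ... and theta(z) = 1 - theta_1 z - ...
    Returns [psi_0, ..., psi_{n-1}]."""
    psi = [1.0]
    for j in range(1, n):
        val = -theta[j - 1] if j <= len(theta) else 0.0
        for i in range(1, min(j, len(phi)) + 1):
            val += phi[i - 1] * psi[j - i]
        psi.append(val)
    return psi

# ARMA(1,1) with phi_1 = 0.5, theta_1 = 0.8: the closed form is psi_0 = 1 and
# psi_j = (phi_1 - theta_1) * phi_1**(j-1) for j >= 1.
w = psi_weights([0.5], [0.8], 6)
print(w)  # [1.0, -0.3, -0.15, -0.075, ...]
```

For an AR(1) this recursion reproduces the geometric expansion $(1 - \phi_1 B)^{-1}$ seen on slide 19.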

## 31. Invertibility conditions

The ARMA(p,q) model $\phi(B) y_t = \theta(B) \epsilon_t$ is called invertible if there exists a sequence of constants $\{\pi_j\}$ such that $\sum_{j=0}^{\infty} |\pi_j| < \infty$ and

$$\epsilon_t = \sum_{j=0}^{\infty} \pi_j y_{t-j}. \tag{3}$$

Assume that the polynomials $\theta(z)$ and $\phi(z)$ have no common zeroes. Then the ARMA(p,q) process is invertible if and only if

$$\theta(z) = 1 - \theta_1 z - \theta_2 z^2 - \cdots - \theta_q z^q \neq 0 \quad \text{for } |z| \leq 1.$$

The coefficients $\pi_j$ are determined by the relation

$$\frac{\phi(z)}{\theta(z)} = \sum_{j=0}^{\infty} \pi_j z^j, \quad |z| \leq 1.$$

We write equation (3) in the compact form $\theta^{-1}(B)\, \phi(B)\, y_t = \epsilon_t$.

## 32. Characteristic polynomials and the unit circle

[Figures: three panels, each showing a characteristic polynomial plotted against $B$, and its roots together with the unit circle in the complex plane (real part vs. imaginary part).]

## 33. Non-zero mean ARMA processes

For ARMA models we have so far assumed a zero-mean stationary process. The generalisation to a stationary ARMA(p,q) process with a non-zero constant mean is straightforward: augmenting the model with an additional parameter $\nu \neq 0$, one obtains

$$\phi(B)\, y_t = \nu + \theta(B)\, \epsilon_t.$$

Inversion of $\phi(B)$ then immediately yields the mean of $y_t$ as

$$\mu = E(y_t) = \phi^{-1}(B)\, \nu = \frac{\nu}{1 - \phi_1 - \cdots - \phi_p}.$$

Note that if $\phi(B) = 1$ (which is the case for the pure MA(q) model) one has $\mu = \nu$.

## 34. Modelling using ARMA processes

1. ARMA model identification;
2. ARMA parameter estimation;
3. ARMA model selection;
4. ARMA model checking;
5. Forecasting from ARMA models.

## 35. ARMA model identification

A plot of the data will give some clue as to whether the series is non-stationary. To analyse an observed stationary time series through an ARMA(p,q) model, the first step is to determine appropriate values for $p$ and $q$. Among the basic tools for such model-order identification are plots of the estimated ACF $\hat{\rho}_k$ and PACF $\hat{\alpha}_k$ against the lag $k$. The shapes of these plots can help to discriminate between competing models.

## 36. ARMA model identification (cont.)

The autocorrelations:

- for an MA(q) process, $\rho_k = 0$ for $k \geq q + 1$;
- for an AR(p) process, they decay exponentially;
- for a mixed ARMA(p,q) process, we expect the correlations to tail off after lag $q - p$.

These considerations assist in deciding whether $p > 0$ and, if not, in choosing the value of $q$.

## 37. Estimators for the ACF/PACF (see Ch. 7 in ref. 3)

Let $(y_1, y_2, \ldots, y_N)$ be a realisation of a weakly stationary time series. The sample autocovariance function is defined by

$$\hat{\gamma}_k = \frac{1}{N} \sum_{t=1}^{N-k} (y_t - \bar{y})(y_{t+k} - \bar{y}), \quad 0 \leq k < N, \qquad \hat{\gamma}_{-k} = \hat{\gamma}_k,$$

where $\bar{y} = \frac{1}{N} \sum_{j=1}^{N} y_j$ is the sample mean. The sample autocorrelation function is defined by

$$\hat{\rho}_k = \frac{\hat{\gamma}_k}{\hat{\gamma}_0}, \quad |k| < N.$$
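These estimators translate directly into code. A hypothetical Python sketch (the course uses R's `acf()`/`pacf()` for this), checked on a series whose sample ACF is known exactly:

```python
# Hypothetical sketch: the sample autocovariance and autocorrelation
# estimators exactly as defined above (note the 1/N divisor, not 1/(N-k)).

def sample_gamma(y, k):
    """Sample autocovariance at lag k >= 0."""
    n = len(y)
    ybar = sum(y) / n
    return sum((y[t] - ybar) * (y[t + k] - ybar) for t in range(n - k)) / n

def sample_rho(y, k):
    """Sample autocorrelation at lag k."""
    return sample_gamma(y, k) / sample_gamma(y, 0)

# An alternating series has sample mean 0 and gamma_0 = 1, and the lag-1
# autocovariance sums N-1 products equal to -1, so rho_1 = -(N-1)/N.
y = [1.0, -1.0] * 10          # N = 20
print(sample_rho(y, 1))       # -0.95
print(sample_rho(y, 2))       # 0.9
```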

## 38. Estimators for the ACF/PACF (cont.)

The sample PACF at lag $k$ can be computed as a function of the sample ACF as $\hat{\alpha}_k = \hat{\psi}_{kk}$, $k \geq 1$, where $\hat{\psi}_{kk}$ is uniquely determined by

$$\begin{pmatrix} \hat{\rho}_0 & \hat{\rho}_1 & \hat{\rho}_2 & \cdots & \hat{\rho}_{k-1} \\ \hat{\rho}_1 & \hat{\rho}_0 & \hat{\rho}_1 & \cdots & \hat{\rho}_{k-2} \\ \vdots & & & & \vdots \\ \hat{\rho}_{k-1} & \hat{\rho}_{k-2} & \hat{\rho}_{k-3} & \cdots & \hat{\rho}_0 \end{pmatrix} \begin{pmatrix} \hat{\psi}_{k1} \\ \hat{\psi}_{k2} \\ \vdots \\ \hat{\psi}_{kk} \end{pmatrix} = \begin{pmatrix} \hat{\rho}_1 \\ \hat{\rho}_2 \\ \vdots \\ \hat{\rho}_k \end{pmatrix}.$$

## 39-41. Simulated series with sample ACF and PACF

[Figures: simulated paths together with their sample ACF and PACF for AR(1) ($\phi_1 = \pm 0.5$), AR(2), MA(1) ($\theta_1 = \pm 0.8$) and ARMA(1,1) models; for the AR(2) cases the true ACF and PACF are shown alongside the sample versions.]

## 42. ARMA parameter estimation

Fitting an ARMA(p,q) model requires estimation of:

- the model parameters $(\phi_1, \ldots, \phi_p)$ and $(\theta_1, \ldots, \theta_q)$;
- the mean $\mu$ (where this is non-zero); and
- the variance $\sigma_\epsilon^2$ of the underlying white noise process $\epsilon_t$.

If we denote the full set of these parameters by a vector $\Theta$, then we can write down a likelihood for the data, $L(\Theta; y) = p(y; \Theta)$, estimate the parameters by maximum likelihood, and derive standard errors and confidence intervals through asymptotic likelihood theory.

## 43. ARMA parameter estimation (cont.)

The usual way to proceed is to assume that $\epsilon_t \sim N(0, \sigma_\epsilon^2)$. The resulting derivation of the likelihood function and the associated maximisation algorithm for the general ARMA(p,q) model is somewhat involved, and we do not go into details here. The basic idea is to factorise the joint distribution as

$$p(y_1, y_2, \ldots, y_N) = p(y_1) \prod_{t=2}^{N} p(y_t \mid y_1, \ldots, y_{t-1}).$$

It may then be shown that $p(y_t \mid y_1, \ldots, y_{t-1})$ is normal with mean given by the predicted value $\hat{y}_t$ of $y_t$, and similarly that the marginal distribution $p(y_1)$ is normal with mean $\hat{y}_1$. The log-likelihood can then be expressed in terms of the prediction errors $(y_t - \hat{y}_t)$, which assists in developing algorithms to effect the maximisation.

## 44. ARMA model selection

We want to find a model that fits the observed data as well as possible. Once fitted, models can be compared by the use of a suitable penalised log-likelihood measure, for example Akaike's Information Criterion, $\mathrm{AIC} = -2 \ln L(\hat{\Theta}) + 2k$, where $k$ is the number of estimated parameters. A variety of other selection criteria have been suggested for choosing an appropriate model; all are similar, differing only in the penalty adjustment involving the number of estimated parameters. As for the AIC, the criteria are generally arranged so that better-fitting models correspond to lower values of the criterion.

## 45. ARMA model checking

The residuals of an ARMA model are estimated by subtracting the fitted model's predictions from the observed time series. If the model assumptions are valid, then we would expect the (standardised) residuals to be independent and normally distributed. In time series analysis it is important to check that there is no autocorrelation remaining in the residuals; plots of residuals against the time ordering are therefore important. Various tests for serial correlation in the residuals are available.

## 46. Example 4

[Figure: a simulated AR(5) series with parameters $0.4, 0.1, 0, 0, 0.1$, plotted against time, together with its sample PACF against lag.]

## 47. Example 5

The function `armafit()` estimates the parameters of ARMA models (its arguments are described on the help page). Consider the time series generated in Example 4 from an AR(5) model with parameters $\phi_1 = 0.4$, $\phi_2 = 0.1$, $\phi_3 = \phi_4 = 0$, $\phi_5 = 0.1$. Examination of the PACF (see above) reveals significant partial correlation at lag 5, after which the correlation is negligible. This suggests using an ARMA(p,q) model with $p = 5$ and $q = 1$ or $2$ (because the PACF of an MA(q) process decays exponentially). We first apply the function `armafit()` to estimate the parameters of an AR(5) model.

## 48. Example 5 (cont.)

```r
fit <- armafit(x ~ ar(5), x, method = "mle")
summary(fit)
```

```
Model: ARIMA(5,0,0) with method: CSS-ML
Coefficient(s): ar1 ar2 ar3 ar4 ar5 intercept ...
Residuals:     Min 1Q Median 3Q Max ...
Moments:       Skewness Kurtosis ...
```

## 49. Example 5 (cont.)

```
Coefficient(s):
           Estimate  Std. Error  t value  Pr(>|t|)
ar1             ...         ...      ...   < 2e-16 ***
ar2             ...         ...      ...       ... **
ar3             ...         ...      ...       ...
ar4             ...         ...      ...       ...
ar5             ...         ...      ...   ...e-06 ***
intercept       ...         ...      ...       ... *
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05

sigma^2 estimated as: ...   log likelihood: ...   AIC Criterion: ...
```

## 50. Example 5 (cont.)

Note that `summary()` also provides the estimate of the variance $\sigma_\epsilon^2$ of the white noise process. The estimated AR coefficients of order 3 and 4 are small and the associated standard errors are large; as a consequence, these coefficients have large p-values (last column) and are not statistically significant according to a 5% t-test. It is therefore a good idea to fit an AR(5) process in which these coefficients (as well as the intercept) are fixed to zero. This can be specified with the `fixed` argument:

## 51. Example 5 (cont.)

```r
fit <- armafit(x ~ ar(5), x, fixed = c(NA, NA, 0, 0, NA, 0), method = "mle")
par(mfrow = c(2, 2))
summary(fit)
```

```
Model: ARIMA(5,0,0) with method: CSS-ML
Coefficient(s): ar1 ar2 ar3 ar4 ar5 intercept ...
Residuals:     Min 1Q Median 3Q Max ...
```

## 52. Example 5 (cont.)

```
Moments:       Skewness Kurtosis ...

Coefficient(s):
           Estimate  Std. Error  t value  Pr(>|t|)
ar1             ...         ...      ...   < 2e-16 ***
ar2             ...         ...      ...       ... ***
ar3             ...         ...      ...       ...
ar4             ...         ...      ...       ...
ar5             ...         ...      ...   ...e-05 ***
intercept       ...         ...      ...       ...
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05

sigma^2 estimated as: ...   log likelihood: ...   AIC Criterion: ...
```

## 53. Diagnostic plots

[Figure: standardised residuals against index, QQ plot of residual quantiles against normal quantiles, ACF of residuals against lag, and Ljung-Box p-values against lag.]

## 54. Example 5 (cont.)

The `summary()` method automatically plots the residuals, the autocorrelation function of the residuals, the standardised residuals, and the Ljung-Box statistics (a test of independence). To investigate the model fit further, we could estimate the parameters of various ARMA(p,q) models with $p_{\max} = 5$ and $q_{\max} = 2$ for the same simulated time series and compare the relative fits through the AIC values (see the R script ex5.r).

## 55. Modelling with ARMA(p,q) models (summary)

Model identification: use the ACF and the PACF to get indicators of $p$ and $q$. The following table can assist:

| Process    | ACF                         | Partial ACF                 |
|------------|-----------------------------|-----------------------------|
| AR(p)      | Exp. decay or damped cosine | Zero after lag $p$          |
| MA(q)      | Cuts off after lag $q$      | Exp. decay or damped cosine |
| ARMA(p,q)  | Exp. decay after lag $q-p$  | Decay after lag $p-q$       |

Parameter estimation: estimate values for the model parameters $(\phi_1, \ldots, \phi_p)$, $(\theta_1, \ldots, \theta_q)$, $\mu$ and $\sigma_\epsilon^2$ (there are several ways to do this, e.g. the Yule-Walker method).

## 56. Modelling with ARMA(p,q) models (summary, cont.)

Model selection: fit ARMA(p,q) models by maximum likelihood, using the Yule-Walker estimates of the parameters as initial values for the maximisation algorithm. Prevent over-fitting by imposing a cost for increasing the number of parameters in the fitted model; one way in which this can be done is via Akaike's Information Criterion (AIC). The model selected is the one that minimises the value of the AIC.

## 57. Model checking

The residuals of a fitted model are the scaled differences between observed and predicted values. Goodness of fit is checked essentially by verifying that the residuals behave like white noise (i.e. a zero-mean i.i.d. process with constant variance). There are several candidates for the residuals; one is computed in the course of determining the maximum likelihood estimates:

$$\hat{W}_t = \frac{Y_t - \hat{Y}_t(\hat{\phi}, \hat{\theta})}{r_{t-1}(\hat{\phi}, \hat{\theta})^{1/2}},$$

where $\hat{Y}_t(\hat{\phi}, \hat{\theta})$ are the predicted values of $Y_t$ based on $Y_1, \ldots, Y_{t-1}$ for the fitted ARMA(p,q) model, and the $r_{t-1}(\hat{\phi}, \hat{\theta})^{1/2}$ are the corresponding scaled root mean squared prediction errors. Another is

$$\hat{Z}_t = \hat{\theta}^{-1}(B)\, \hat{\phi}(B)\, Y_t.$$


## 59. Forecasting from ARMA models

Given a series $(y_1, y_2, \ldots, y_N)$ up to time $N$, a prominent task within time series analysis is to provide estimates of future values $y_{N+h}$, $h = 1, 2, \ldots$, conditionally on the available information, i.e. $y_N, y_{N-1}, y_{N-2}, \ldots$. Within the class of weakly stationary ARMA(p,q) processes, $y_{N+h}$ is given by

$$y_{N+h} = \nu + \phi_1 y_{N+h-1} + \cdots + \phi_p y_{N+h-p} + \epsilon_{N+h} - \theta_1 \epsilon_{N+h-1} - \cdots - \theta_q \epsilon_{N+h-q}. \quad (*)$$

## 60. Forecasting from ARMA models (cont.)

An obvious forecast for $y_{N+h}$ is $\hat{y}_{N+h} = E[\, y_{N+h} \mid y_N, y_{N-1}, y_{N-2}, \ldots\,]$, i.e. its expected value given the observed series. The computation of this expectation follows a recursive scheme: substitute into equation $(*)$ the observed values $y_{N+j}$ for $j \leq 0$ and the forecasts $\hat{y}_{N+j}$ for $j > 0$, and take $\epsilon_{N+j} = 0$ for $j > 0$.

## 61. Forecasting from ARMA models (cont.)

For example, for the ARMA(1,1) model with a non-zero mean, equation $(*)$ is

$$y_{N+h} = \nu + \phi_1 y_{N+h-1} + \epsilon_{N+h} - \theta_1 \epsilon_{N+h-1},$$

so we obtain successively:

$$\hat{y}_{N+1} = \nu + \phi_1 y_N - \theta_1 \epsilon_N,$$
$$\hat{y}_{N+2} = \nu + \phi_1 \hat{y}_{N+1} = \nu + \phi_1(\nu + \phi_1 y_N - \theta_1 \epsilon_N),$$
$$\hat{y}_{N+3} = \nu + \phi_1 \hat{y}_{N+2} = \nu + \phi_1(\nu + \phi_1(\nu + \phi_1 y_N - \theta_1 \epsilon_N)),$$

and so on. Iterating this scheme shows that, as the forecast horizon increases, the forecast converges to the mean $\mu$ of the process.
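The recursion above is easy to implement. A hypothetical Python sketch (illustrative only; in the course this is done by R's prediction methods):

```python
# Hypothetical sketch: the recursive ARMA(1,1) forecast scheme above, showing
# that y_hat(N+h) converges to the process mean mu = nu / (1 - phi_1) as the
# horizon h grows.

def arma11_forecasts(nu, phi, theta, y_N, eps_N, H):
    """Return [y_hat(N+1), ..., y_hat(N+H)]."""
    f = [nu + phi * y_N - theta * eps_N]   # h = 1: uses y_N and eps_N
    for _ in range(H - 1):
        f.append(nu + phi * f[-1])         # h >= 2: future shocks set to 0
    return f

nu, phi, theta = 1.0, 0.5, 0.8
f = arma11_forecasts(nu, phi, theta, y_N=2.0, eps_N=0.3, H=50)
mu = nu / (1 - phi)
print(f[0], f[1], f[-1], mu)  # 1.76, 1.88, then values approaching mu = 2
```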

## 62. Forecasting from ARMA models (cont.)

Obtaining the sequence of forecast errors $\hat{\epsilon}_{N+h} = y_{N+h} - \hat{y}_{N+h}$ follows the same sort of scheme, so that

$$\hat{\epsilon}_{N+1} = y_{N+1} - \hat{y}_{N+1} = \nu + \phi_1 y_N + \epsilon_{N+1} - \theta_1 \epsilon_N - (\nu + \phi_1 y_N - \theta_1 \epsilon_N) = \epsilon_{N+1}.$$

Iterating along similar lines we obtain

$$\hat{\epsilon}_{N+2} = \epsilon_{N+2} + (\phi_1 - \theta_1)\epsilon_{N+1},$$
$$\hat{\epsilon}_{N+3} = \epsilon_{N+3} + (\phi_1 - \theta_1)\epsilon_{N+2} + \phi_1(\phi_1 - \theta_1)\epsilon_{N+1},$$

and so on.

## 63. Forecasting from ARMA models (cont.)

The forecasts $\hat{y}_{N+h}$ are unbiased, and so the expected values of the forecast errors $\hat{\epsilon}_{N+h}$ are zero. The variance of the forecast error, however, increases with $h$; in the limit as $h$ increases, this variance converges to the unconditional variance of the process, i.e. $\mathrm{Var}(y_t) = \sigma^2 = \gamma_0$. Clearly, in practical forecasting from an ARMA(p,q) model the values of the parameters $(\phi_1, \ldots, \phi_p)$ and $(\theta_1, \ldots, \theta_q)$ are unknown, and they are replaced by their maximum likelihood estimates. Standard errors and confidence intervals for the forecasts may be derived from general likelihood theory in the usual way. See Ex. 6.

## 64. Forecast plot

[Figure: the series x against time with forecasts from the fitted ARIMA(5,0,0) model (method CSS-ML).]

## 65. Non-stationary processes

Many time series encountered in practice exhibit non-stationary behaviour. For example, there may be non-stationarity in the mean component, e.g. a time trend or a seasonal effect in $\mu_t$. We may think of this situation as the series consisting of a non-constant systematic (trend) component, usually some relatively simple function of time, plus a random component which is a zero-mean stationary series. Note that such a model is only reasonable if there are good reasons for believing that the trend remains appropriate indefinitely. There are several methods to eliminate trend and seasonal effects in order to generate stationary data.

## 66. ARIMA models

For some types of time series, the non-stationary behaviour of the mean $\mu_t$ is simple enough that some differencing of the original series yields a new series which is stationary (so that $\mu_t$ is constant). For example, for financial time series comprising log prices, first differencing (giving log returns) is often sufficient to produce a stationary time series with a constant mean. The differenced series can then be modelled directly by an ARMA process, and no additional systematic component is required. This type of time series modelling, in which some degree of differencing is combined with an ARMA model, is called Auto-regressive Integrated Moving Average (ARIMA) modelling.

## 67. ARIMA models (cont.)

We have seen already that if any root of the characteristic equation of an ARMA(p,q) model lies inside or on the unit circle, then the process is not stationary. In general, if the modulus of a root is strictly inside the unit circle, this leads to exponential or explosive behaviour in the series, and no practical models result. If the modulus of the offending root lies on the circle, a more reasonable type of non-stationarity results: for example, the simple random walk $y_t = y_{t-1} + \epsilon_t$. Note that the first difference of this series, $y_t - y_{t-1}$, is a white noise process.

## 68. ARIMA models (cont.)

This differencing idea can be generalised to models in which the first difference of the process, $X_t = (1 - B) Y_t = Y_t - Y_{t-1}$, is a stationary ARMA process rather than white noise. More generally, for $d \geq 1$, $Y_t$ is an ARIMA(p,d,q) process if $X_t = (1 - B)^d Y_t$ is an ARMA(p,q) process. An ARIMA(p,d,q) process $Y_t$ satisfies

$$\phi^*(B)\, Y_t \equiv \phi(B)(1 - B)^d\, Y_t = \theta(B)\, \epsilon_t,$$

where $\phi(z)$ and $\theta(z)$ are polynomials of degrees $p$ and $q$ respectively, $\phi(z) \neq 0$ for $|z| \leq 1$, and $\epsilon_t$ is a white noise process. An ARIMA model for a series $y_t$ is thus one where a differencing operation on $y_t$ leads to a series with stationary ARMA behaviour.
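The random-walk case makes the idea concrete: one round of differencing turns an ARIMA(0,1,0) series back into its white-noise increments. A hypothetical Python sketch (illustrative; in R this is `diff(y)`):

```python
# Hypothetical sketch: build a random walk y_t = y_{t-1} + e_t and check that
# the first difference X_t = (1 - B) y_t = y_t - y_{t-1} recovers the
# white-noise increments, so d = 1 is the right degree of differencing here.
import random

random.seed(7)
eps = [random.gauss(0, 1) for _ in range(1000)]

y = [0.0]
for e in eps:
    y.append(y[-1] + e)                       # cumulative sum: the random walk

diff = [y[t] - y[t - 1] for t in range(1, len(y))]   # first differences

# the differences equal the original increments up to floating-point rounding
print(max(abs(d - e) for d, e in zip(diff, eps)))
```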

## 69. ARIMA models (cont.)

A distinctive feature of data suggesting the appropriateness of an ARIMA model is a slowly decaying positive sample ACF. Sample ACFs with slowly decaying oscillatory behaviour are associated with models $\phi^*(B) Y_t = \theta(B) \epsilon_t$ in which $\phi^*$ has a zero near $e^{i\alpha}$ for some $\alpha \in (-\pi, \pi]$ other than $\alpha = 0$. In modelling using ARIMA processes, the original series is simply differenced until stationarity is obtained, and the differenced series is then modelled following the standard ARMA approach.

## 70. ARIMA models (cont.)

Results may then be transformed back to the undifferenced original scale if required. The choice of an appropriate differencing parameter $d$ adds an extra dimension to model choice. For financial time series with non-stationary behaviour, as mentioned earlier, first differencing (which leads to the use of log returns) is usually sufficient to produce a time series with a stationary mean.

## 71. ARIMA models (summary)

- Plot the data to determine whether there is a trend. Of course this is only an indication, and what we see as a trend may be part of a very long-term cycle.
- Use the sample ACF and PACF to determine whether it is possible to model the time series with an ARIMA model.
- Use differencing to obtain an ARMA model, then model the differenced data using ARMA modelling.

See Ex. 7.

## 72. ARCH and GARCH modelling

ARMA and ARIMA modelling is flexible and widely applicable. However, some financial time series show effects which cannot be adequately explained by these sorts of models. One particular feature is so-called volatility clustering: a tendency for the variance of the random component to be large if the magnitude of recent errors has been large, and smaller if the magnitude of recent errors has been small. This kind of behaviour requires non-stationarity in the variance (i.e. heteroscedasticity) rather than in the mean, and it leads to alternatives to the ARIMA family referred to as ARCH and GARCH models.

## 73. ARCH and GARCH modelling (cont.)

A dominant feature in many financial series is volatility clustering: the conditional variance of $\epsilon_t$ appears to be large if recent observations $\epsilon_{t-1}, \epsilon_{t-2}, \ldots$ are large in absolute value, and small during periods where lagged innovations are also small in absolute value. This effect cannot be explained by ARIMA models, which assume a constant innovation variance. Autoregressive Conditionally Heteroscedastic (ARCH) models (Engle, 1982) were developed to model changes in volatility; these were extended to Generalised ARCH, or GARCH, models (Bollerslev, 1986).

## 74. ARCH models

Let $x_t$ be the value of a stock at time $t$. The return, or relative gain, $y_t$, of the stock at time $t$ is

$$y_t = \frac{x_t - x_{t-1}}{x_{t-1}}.$$

Note that for financial series the return does not have a constant variance: highly volatile periods tend to be clustered together, and there is a strong dependence of sudden bursts of variability in a return on the series' own past. Volatility models such as ARCH and GARCH are used to study the returns $y_t$.

## 75. ARCH(1) models

The simplest ARCH model, the ARCH(1), models the return as

$$y_t = \sigma_t \epsilon_t, \qquad \sigma_t^2 = \omega + \alpha_1 y_{t-1}^2,$$

where $\epsilon_t \sim N(0, 1)$. As with ARMA models, we impose constraints on the model parameters to obtain desirable properties: sufficient conditions that guarantee $\sigma_t^2 > 0$ are $\omega > 0$ and $\alpha_1 \geq 0$.
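A hypothetical Python sketch (not from the notes, which use R) simulating this model, and previewing the properties stated on the next slide: the returns are essentially uncorrelated, but their squares are not.

```python
# Hypothetical sketch: simulate an ARCH(1) process y_t = sigma_t * eps_t with
# sigma_t^2 = omega + alpha * y_{t-1}^2, then compare the lag-1 correlation of
# the returns (near zero) with that of the squared returns (clearly positive:
# volatility clustering).
import random

random.seed(3)
omega, alpha, n = 0.2, 0.5, 100_000
y, prev = [], 0.0
for _ in range(n):
    sigma2 = omega + alpha * prev ** 2
    prev = sigma2 ** 0.5 * random.gauss(0, 1)
    y.append(prev)

def lag1_corr(z):
    m = sum(z) / len(z)
    num = sum((z[t] - m) * (z[t + 1] - m) for t in range(len(z) - 1))
    den = sum((v - m) ** 2 for v in z)
    return num / den

print(lag1_corr(y))                   # near 0
print(lag1_corr([v * v for v in y]))  # clearly positive (theory: alpha)
```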

## 76. ARCH(1) properties

Conditionally on $y_{t-1}$, $y_t$ is Gaussian:

$$y_t \mid y_{t-1} \sim N(0,\, \omega + \alpha_1 y_{t-1}^2).$$

The returns $\{y_t\}$ have zero mean and are uncorrelated. The squared returns $\{y_t^2\}$ satisfy

$$y_t^2 = \omega + \alpha_1 y_{t-1}^2 + v_t,$$

where the error process $v_t = \sigma_t^2(\epsilon_t^2 - 1)$ is a white noise process. Hence ARCH(1) models the returns $\{y_t\}$ as a white noise process with non-constant conditional variance, where the conditional variance depends on the previous return: the returns $\{y_t\}$ are uncorrelated, whereas their squares $\{y_t^2\}$ follow a non-Gaussian autoregressive process.

## 77. ARCH(1) models (cont.)

Moreover, the kurtosis of $y_t$ is

$$\kappa = \frac{E[y_t^4]}{E[y_t^2]^2} = \frac{3(1 - \alpha_1^2)}{1 - 3\alpha_1^2},$$

which is always larger than 3, the kurtosis of the normal distribution. Thus the marginal distribution of the returns $y_t$ is leptokurtic, i.e. has heavy tails, so outliers are more likely. This agrees with empirical evidence: outliers appear more often in asset returns than would be implied by an i.i.d. sequence of normal random variates.

78 ARCH(1) Models (cont.) Estimation of the parameters ω and α_1 of the ARCH(1) model is accomplished using conditional MLE. The likelihood of the data y_2, …, y_n conditional on y_1 is given by L(ω, α_1 | y_1) = ∏_{t=2}^{n} f_{ω,α_1}(y_t | y_{t-1}), where f_{ω,α_1}(y_t | y_{t-1}) is the N(0, ω + α_1 y_{t-1}^2) density, that is f_{ω,α_1}(y_t | y_{t-1}) ∝ (ω + α_1 y_{t-1}^2)^{-1/2} exp( -y_t^2 / (2(ω + α_1 y_{t-1}^2)) ). 76

79 ARCH(1) Models (cont.) Hence, the objective function to be maximised is the conditional log-likelihood l(ω, α_1 | y_1) = ln L(ω, α_1 | y_1) = const - (1/2) ∑_{t=2}^{n} ln(ω + α_1 y_{t-1}^2) - (1/2) ∑_{t=2}^{n} y_t^2 / (ω + α_1 y_{t-1}^2). Maximisation of this function is achieved using numerical methods (analytic expressions for the gradient vector and Hessian matrix of the log-likelihood function can be obtained). 77
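A minimal sketch of this estimation in numpy: it simulates ARCH(1) data with known parameters and maximises the conditional log-likelihood by a crude grid search. The parameter values and grid ranges are arbitrary choices for illustration; a real implementation would use a numerical optimiser, as the slide indicates.

```python
import numpy as np

def neg_loglik(omega, alpha1, y):
    """Conditional negative log-likelihood of ARCH(1), up to an additive constant."""
    s2 = omega + alpha1 * y[:-1] ** 2  # sigma_t^2 for t = 2, ..., n
    return 0.5 * np.sum(np.log(s2) + y[1:] ** 2 / s2)

# Simulate ARCH(1) data with known (illustrative) parameters.
rng = np.random.default_rng(1)
omega_true, alpha_true, n = 0.1, 0.5, 4000
y = np.zeros(n)
for t in range(1, n):
    y[t] = np.sqrt(omega_true + alpha_true * y[t - 1] ** 2) * rng.standard_normal()

# Crude maximisation over a parameter grid; in practice one would use a
# gradient-based optimiser on the log-likelihood instead.
omegas = np.linspace(0.01, 0.3, 60)
alphas = np.linspace(0.05, 0.95, 60)
nll = np.array([[neg_loglik(w, a, y) for a in alphas] for w in omegas])
i, j = np.unravel_index(np.argmin(nll), nll.shape)
print(omegas[i], alphas[j])  # close to the true (0.1, 0.5)
```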

80 ARCH(m) Models The general ARCH(m) model is defined by y_t = σ_t ε_t, σ_t^2 = ω + α_1 y_{t-1}^2 + … + α_m y_{t-m}^2, where the parameter m determines the maximum order of lagged innovations which are supposed to have an impact on current volatility. Results similar to those for the ARCH(1) model hold: y_t | y_{t-1}, …, y_{t-m} ~ N(0, ω + α_1 y_{t-1}^2 + … + α_m y_{t-m}^2), and y_t^2 = ω + α_1 y_{t-1}^2 + … + α_m y_{t-m}^2 + v_t, where v_t = σ_t^2 (ε_t^2 - 1) is a scaled and shifted χ²_1 random variable; y_t and v_t have zero mean. Estimation of the parameters ω, α_1, …, α_m is similar to that for ARCH(1). 78

81 Building ARCH models First, build an ARIMA model for the observed time series to remove any serial correlation in the data. Then examine the squared residuals to check for conditional heteroscedasticity, and use the PACF of the squared residuals to determine the ARCH order. As final remarks, we should comment on some of the weaknesses of ARCH: it treats positive and negative returns in the same way (volatility depends only on past squared returns), and it often over-predicts the volatility, because it responds slowly to large shocks. 79
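The identification step (PACF of the squared residuals) can be sketched with a hand-rolled sample PACF computed from the Yule-Walker equations; for simulated ARCH(1) data the PACF of y_t^2 should show a single clear spike at lag 1. The helper functions and parameter values below are illustrative constructions, not from the slides.

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelations r_0, ..., r_nlags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[: len(x) - k], x[k:]) / (len(x) * c0)
                     for k in range(nlags + 1)])

def sample_pacf(x, nlags):
    """Sample PACF via Yule-Walker: the lag-k PACF is the last AR(k) coefficient."""
    r = sample_acf(x, nlags)
    pac = [1.0]
    for k in range(1, nlags + 1):
        R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
        pac.append(np.linalg.solve(R, r[1 : k + 1])[-1])
    return np.array(pac)

# Simulated ARCH(1) data (illustrative parameters omega = 0.1, alpha1 = 0.5).
rng = np.random.default_rng(2)
y = np.zeros(4000)
for t in range(1, 4000):
    y[t] = np.sqrt(0.1 + 0.5 * y[t - 1] ** 2) * rng.standard_normal()

pac = sample_pacf(y ** 2, 5)
print(pac)  # lag 1 clearly non-zero; lags 2..5 close to zero
```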

82 GARCH(m,r) models Generalised ARCH models, the GARCH(m,r) processes (Bollerslev, 1986), are obtained by augmenting σ_t^2 with an autoregressive component in σ_t^2. For instance, a GARCH(1,1) model is y_t = σ_t ε_t, σ_t^2 = ω + α_1 y_{t-1}^2 + β_1 σ_{t-1}^2. Assuming α_1 + β_1 < 1 and using similar manipulations as before, it can be shown that the GARCH(1,1) model admits a non-Gaussian ARMA(1,1) model for the squared process. Indeed, y_t^2 = σ_t^2 ε_t^2 and ω + α_1 y_{t-1}^2 + β_1 σ_{t-1}^2 = σ_t^2. 80

83 GARCH(m,r) models (cont) It can be seen that y_t^2 - σ_t^2 = σ_t^2 (ε_t^2 - 1) = v_t and y_{t-1}^2 - σ_{t-1}^2 = σ_{t-1}^2 (ε_{t-1}^2 - 1) = v_{t-1}, and then y_t^2 - ω - α_1 y_{t-1}^2 - β_1 σ_{t-1}^2 = v_t, so y_t^2 = ω + α_1 y_{t-1}^2 + β_1 y_{t-1}^2 + β_1 (σ_{t-1}^2 - y_{t-1}^2) + v_t, and so y_t^2 = ω + (α_1 + β_1) y_{t-1}^2 - β_1 v_{t-1} + v_t. 81
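The derivation above can be checked numerically: simulate a GARCH(1,1), form v_t = y_t^2 - σ_t^2, and verify that y_t^2 = ω + (α_1 + β_1) y_{t-1}^2 - β_1 v_{t-1} + v_t holds exactly. A numpy sketch, with illustrative parameters satisfying α_1 + β_1 < 1:

```python
import numpy as np

# Simulate a GARCH(1,1) (illustrative parameters with alpha1 + beta1 < 1).
omega, a1, b1, n = 0.05, 0.1, 0.85, 1000
rng = np.random.default_rng(3)
y = np.zeros(n)
s2 = np.zeros(n)
s2[0] = omega / (1.0 - a1 - b1)  # start at the unconditional variance
y[0] = np.sqrt(s2[0]) * rng.standard_normal()
for t in range(1, n):
    s2[t] = omega + a1 * y[t - 1] ** 2 + b1 * s2[t - 1]
    y[t] = np.sqrt(s2[t]) * rng.standard_normal()

# Check the ARMA(1,1) identity for the squared process term by term.
v = y ** 2 - s2  # v_t = sigma_t^2 (eps_t^2 - 1)
lhs = y[1:] ** 2
rhs = omega + (a1 + b1) * y[:-1] ** 2 - b1 * v[:-1] + v[1:]
print(np.max(np.abs(lhs - rhs)))  # essentially zero: the identity is exact
```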

84 GARCH(m,r) models (cont) In general, the GARCH(m,r) model is y_t = σ_t ε_t, σ_t^2 = ω + α_1 y_{t-1}^2 + … + α_m y_{t-m}^2 + β_1 σ_{t-1}^2 + … + β_r σ_{t-r}^2. Sufficient conditions for the conditional variance to be positive are obvious: ω > 0 and α_i, β_j ≥ 0 for i = 1, …, m; j = 1, …, r. Using polynomials in the lag operator B, the specification of σ_t^2 may also be given by (1 - β_1 B - … - β_r B^r) σ_t^2 = ω + (α_1 B + … + α_m B^m) y_t^2, or (1 - β(B)) σ_t^2 = ω + α(B) y_t^2. 82

85 GARCH(m,r) models (cont) Assuming the zeros of the polynomial 1 - β(z) are larger than one in absolute value, the model can also be written as an ARCH process of infinite order: σ_t^2 = (1 - β(B))^{-1} ω + (1 - β(B))^{-1} α(B) y_t^2. Note that a GARCH(m,r) admits a non-Gaussian ARMA(max(m,r), r) model for the squared process: y_t^2 = ω + ∑_{i=1}^{max(m,r)} (α_i + β_i) y_{t-i}^2 + v_t - ∑_{i=1}^{r} β_i v_{t-i}, with α_i = 0 for i > m and β_i = 0 for i > r. Building and fitting GARCH models follows similarly to that discussed previously for ARCH models. See Ex. 8. 83
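For GARCH(1,1) the ARCH(∞) representation can also be checked numerically, since (1 - β_1 B)^{-1} expands as a geometric series: σ_t^2 = ω/(1 - β_1) + α_1 ∑_{j≥0} β_1^j y_{t-1-j}^2. The sketch below truncates the sum at J terms; the parameters are illustrative and J is chosen so that β_1^J is negligible.

```python
import numpy as np

# GARCH(1,1) simulation (illustrative parameters).
omega, a1, b1, n = 0.05, 0.1, 0.85, 500
rng = np.random.default_rng(4)
y = np.zeros(n)
s2 = np.zeros(n)
s2[0] = omega / (1.0 - a1 - b1)
y[0] = np.sqrt(s2[0]) * rng.standard_normal()
for t in range(1, n):
    s2[t] = omega + a1 * y[t - 1] ** 2 + b1 * s2[t - 1]
    y[t] = np.sqrt(s2[t]) * rng.standard_normal()

# Compare sigma_t^2 from the recursion with the truncated ARCH(infinity) form
# sigma_t^2 ~= omega / (1 - b1) + a1 * sum_{j<J} b1**j * y_{t-1-j}**2.
t, J = n - 1, 200
approx = omega / (1.0 - b1) + a1 * sum(b1 ** j * y[t - 1 - j] ** 2 for j in range(J))
print(abs(approx - s2[t]))  # negligible, since b1**J is tiny
```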
