Effect of Regressor Forecast Error on the Variance of Regression Forecasts
Journal of Forecasting, J. Forecast. 19, 587–600 (2000)

Effect of Regressor Forecast Error on the Variance of Regression Forecasts

LEONARD J. TASHMAN,¹* THORODD BAKKEN² AND JEFFREY BUZAS¹
¹ University of Vermont, USA
² Citibank International, Norway

ABSTRACT

It is well understood that the standard formulation for the variance of a regression-model forecast produces interval estimates that are too narrow, principally because it ignores regressor forecast error. While the theoretical problem has been addressed, there has not been an adequate explanation of the effect of regressor forecast error, and the empirical literature has supplied a disparate variety of bits and pieces of evidence. Most business-forecasting software programs continue to supply only the standard formulation. This paper extends existing analysis to derive and evaluate large-sample approximations for the forecast error variance in a single-equation regression model. We show how these approximations substantially clarify the expected effects of regressor forecast error. We then present a case study, which (a) demonstrates how rolling out-of-sample evaluations can be applied to obtain empirical estimates of the forecast error variance, (b) shows that these estimates are consistent with our large-sample approximations and (c) illustrates, for 'typical' data, how seriously the standard formulation can understate the forecast error variance. Copyright © 2000 John Wiley & Sons, Ltd.

KEY WORDS: regression; regressor; ex ante versus ex post forecasts; forecast error variance; relative variance; prediction interval; out-of-sample; rolling evaluation

INTRODUCTION

It is well understood that the standard formulation for the variance of a regression-model forecast (hence Standard) produces interval estimates that are too narrow. Articulate statements of this problem can be found, among many other places, in Adams (1987, p. 139), Fildes (1985, p. 563), Intrilligator (1978, p.
517), Levenbach and Cleary (1984, p. 241), and Newbold and Bos (1994, p. 8). The source of difficulty, in large part, is that the Standard assumes that future values of each regressor are known with certainty.

* Correspondence to: Leonard J. Tashman, School of Business Administration, University of Vermont, Burlington, VT 05405, USA. LenTashman@Compuserve.com

Copyright © 2000 John Wiley & Sons, Ltd.
The error variance of an h-step-ahead forecast from origin T is represented, for the case of a single regressor, by equation (0), the Standard (Diebold, 1998, p. 293). Incorporated here is uncertainty due to (a) random variation about the true regression function, $\sigma^2_\varepsilon$, and (b) estimation (sampling) error in the regression coefficients. The latter component, in turn, depends also upon sample size and the deviations of future values of the regressor from their in-sample mean. The Standard:

$$s^2\left[1 + \frac{1}{n} + \frac{(x_{T+h}-\bar{x})^2}{\sum_t (x_t-\bar{x})^2}\right] \qquad (0)$$

However, error in forecasting out-of-sample values of the regressor (hence, regressor forecast error) introduces an additional source of uncertainty. Hence the forecast error variance is necessarily larger than that given by (0). The Standard also fails to reflect the likelihood that the uncertainty associated with out-of-sample forecasts of the regressor will increase with the lead time of these forecasts. Many textbooks take care to note that the Standard applies to forecasts that are conditional on the assumed values of the regressor (see, for example, Diebold, 1998, p. 291). No such admonitions, however, are to be found in the manuals of six well-known forecasting programs (see the Software List, before the Reference list). The concern here is that practitioners insouciantly report prediction intervals that are too narrow, perhaps severely so.

Two key issues emerge. How much greater will the forecast error variance be (or how much wider will a prediction interval be) if it is also to account for regressor forecast error? Can variance inflation factors be estimated and used as adjustments to the Standard? The empirical literature, reviewed in the next section, supplies bits and pieces of often disparate evidence, mostly gleaned from comparisons of ex post (X known out-of-sample) and ex ante (out-of-sample values of X must be forecast) scenarios (see, for example, Osborn and Teal, 1979; Jarrett, 1990).
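Equation (0) is straightforward to compute. The following is a minimal sketch for the single-regressor case; the function name and argument layout are our own illustration, not part of the paper:

```python
import numpy as np

def standard_forecast_variance(x, s2, x_future):
    """Equation (0), the Standard: forecast error variance for a
    single-regressor model, given the in-sample regressor values x,
    the residual variance s2, and an assumed future regressor value.
    Note that x_future is treated as known with certainty."""
    n = len(x)
    xbar = np.mean(x)
    return s2 * (1.0 + 1.0 / n + (x_future - xbar) ** 2 / np.sum((x - xbar) ** 2))
```

The further the assumed future value lies from the in-sample mean, the wider the implied interval; nothing here, however, accounts for error in forecasting `x_future` itself, which is precisely the paper's point.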
However, because they have not systematically linked measures of forecast error in the dependent variable to regressor forecast error, these studies have not provided useful guidance to practitioners of regression forecasting. In contrast, Feldstein's early analysis (Feldstein, 1971) of the case of stochastic regressors provides a useful starting point. For the dependent variable, let us define the relative forecast error variance (hence rfev) as the ratio of the ex ante to the ex post forecast error variance. The denominator, the ex post forecast error variance, is the Standard.

In the third section, we extend the Feldstein analysis to derive large-sample approximations for the rfev. These substantially clarify the relationship between forecast error in the dependent variable and regressor forecast error. We then describe a methodology for estimating the rfev from out-of-sample forecast errors. Our methodology employs matching, rolling out-of-sample forecasts of the regressors and the dependent variable. A case study is presented to illustrate, for 'typical' data, how seriously the Standard can understate the ex ante forecast error variance. If our large-sample approximation of the rfev is adequate, straightforward adjustments to the Standard can be made to account for regressor forecast error. We use the case study data to test the large-sample approximations against purely empirical measures of the variance of out-of-sample forecast errors. The results show a close match.
PRIOR EVIDENCE

There is ample evidence that regressor forecast error can have serious implications. Jacobs and Sterken (1994) compare ex post and ex ante scenarios for several variables in their macroeconomic model, GUESS, and conclude that 'forecast errors to a large extent can be attributed to wrong assumptions for exogenous variables' (p. 17). Ashley's (1983, 1988) studies demonstrate that, for use of macroeconomic variables as regressors in forecasting models, the incremental error often is so severe as to make inclusion of the regressor in a model more harmful than beneficial. Bassin's (1987) analysis of the errors in regression-model forecasts of quarterly shipments series in 15 industries found that the mean absolute percent forecast errors (MAPEs) were twice as large on average when the (largely macroeconomic) regressors were forecast ex ante as when known values were assumed ex post. In this study, the ex ante forecasts of the regressors were econometric forecasts obtained from Data Resources, Inc. and all forecast error measures represented averages across the forecast horizon of 1–12 quarters.

Geriner and Ord (1989) performed ex ante versus ex post evaluations to compare bivariate against univariate forecasting models. The ex post forecasts were made using the known, post-sample values of the explanatory variables. The ex ante forecasts were based on univariate ARIMA projections of the regressor. For their four annual data series, ex ante forecasting accuracy is substantially worse than ex post, at both short and long horizons. For a one-period-ahead forecast, the ex ante measure is 60% higher than the ex post. For the average of 1–6 periods ahead, the ex ante measure is 2.5 times as large. Curiously, however, for their two monthly and four quarterly series, ex ante forecasting accuracy is no worse than ex post.
In seeming contradiction to the studies cited above, Armstrong (1985) writes that of 13 published studies he found, all from the 1960s and 1970s, ten 'mysteriously' showed that ex ante forecasting accuracy was at least as good as, if not better than, ex post accuracy. His inference: 'The point is quickly reached where greater accuracy in forecasting the causal variables does not lead to greater accuracy in forecasting the dependent variable' (p. 241).

Armstrong's surmise raises additional questions. In the studies cited, how serious were the magnitudes of regressor forecast error? Are there diminishing returns to improved regressor forecasting accuracy? Finally, what sort of invisible hand is at work that benevolently offsets the effect of regressor forecast error, leaving ex ante accuracy no worse, indeed sometimes better, than ex post accuracy? Our analysis in the next section will show that, when forecasting is done 'automatically' (no user judgement is input), there is no invisible hand at work; that is, in terms of expectations, the uncertainty imparted by regressor forecast error must widen the forecast error variance. Individual departures from the mathematical expectation are, of course, possible: in two of the 15 industries examined by Bassin (1987), the ex ante MAPE was below the ex post MAPE. We do find support for Armstrong's diminishing returns argument, as will be shown in the next section.

RELATIVE FORECAST ERROR VARIANCE

Making the traditional assumptions that underpin the classical regression model and, in addition, assuming independence of (a) model errors from (b) errors in forecasting the regressors,
Feldstein (1971) derives a general expression for the ex ante forecast error variance in a single-equation regression model (Eqs (3) or (4), p. 56). Pindyck and Rubinfeld present a simplified version of the Feldstein equation, in which there is but a single explanatory variable (1991, Eq. (8.46), p. 197). Neither Feldstein nor Pindyck and Rubinfeld show explicitly how the forecast error variance is affected by the magnitudes of regressor forecast error.

To clarify the relationship between the forecast error variance and regressor forecast error, we have derived large-sample approximations for the rfev, again based upon the single-equation regression model that satisfies the traditional assumptions of zero mean and constant variance. The key additional assumptions needed for our derivation are that regressor forecast errors are uncorrelated (a) among themselves and (b) with model errors. A useful property of our expressions is that they are independent of the measurement scales of both the dependent variable and the regressors and thus are relatively easy to interpret.

Let the dependent variable $y_t$ be related to the vector of regressors $\tilde{x}_t = (x_{1t}, x_{2t}, x_{3t}, \ldots, x_{kt})$ by

$$y_t = \tilde{x}_t \beta + \varepsilon_t, \qquad t = 1, \ldots, T$$

where $\beta$ is a vector of regression coefficients and $\varepsilon_t$ is random error with mean zero and variance $\sigma^2_\varepsilon$. In our large-sample derivations (Appendix A), it will not be necessary to assume that the errors $\{\varepsilon_t\}_{t=1}^{T}$ are uncorrelated. The regressors $\tilde{x}_t$ are assumed to be stochastic with mean $\mu_x$ and covariance matrix $\Sigma_x$. Let $\hat{\beta}$ represent an estimate of $\beta$ from the observations $\{y_t, \tilde{x}_t\}_{t=1}^{T}$; typically $\hat{\beta}$ is the ordinary least squares estimator.
The h-step-ahead forecast, $\hat{y}_{T+h}$, for the dependent variable when the vector $\tilde{x}_{T+h}$ is known is

$$\hat{y}_{T+h} = \tilde{x}_{T+h}\hat{\beta}$$

and the C% prediction interval for $y_{T+h}$ is

$$\hat{y}_{T+h} \pm t_{n-k,C}\, s(\hat{y}_{T+h})$$

where $s(\hat{y}_{T+h}) = \sqrt{E(y_{T+h} - \hat{y}_{T+h})^2}$ is the standard deviation of the forecast error and $t_{n-k,C}$ is the critical value from the t-distribution with n − k degrees of freedom and confidence level C. The exact expression for $s(\hat{y}_{T+h})$ will depend on how $\hat{\beta}$ is estimated and on assumptions about the correlation structure of $\{\varepsilon_t\}_{t=1}^{T}$.

When the explanatory variables themselves must be forecast, we let $\hat{x}_{T+h}$ represent the vector of forecasts from time T. We assume that

$$\hat{x}_{T+h} = \tilde{x}_{T+h} + u_{T+h}$$

where the vector $u_{T+h}$ denotes the forecast errors. We assume that $u_{T+h}$ has mean zero and diagonal covariance matrix

$$\Sigma_u = \mathrm{diag}\big(\sigma^2_{u_1}, \ldots, \sigma^2_{u_k}\big) = \mathrm{diag}\big(p_1\,\sigma^2_{x_1}, \ldots, p_k\,\sigma^2_{x_k}\big)$$
where $p_j$ represents the ratio of the forecast error variance for $x_j$ to the variance of $x_j$; that is,

$$p_j = \frac{\sigma^2_{u_j}}{\sigma^2_{x_j}}$$

The $p_j$ represent the portion of the variance of x that is unexplained by the forecasting model for x. The form of the covariance matrix follows from the assumption that regressor forecast errors are uncorrelated among themselves. It is worth noting that if one restricts attention to the case in which regressors are forecast from an autoregressive process, then our assumption that regressor forecast errors are uncorrelated implies that the regressors themselves are uncorrelated. In the more general framework explored here, however, one can assume uncorrelated forecast errors without implying uncorrelated regressors.

The forecast for $y_{T+h}$ when the regressors are forecast is given by

$$\tilde{y}_{T+h} = \hat{x}_{T+h}\hat{\beta}$$

For large samples and normally distributed forecast error in both the regressor and the model, the error in forecasting the dependent variable is approximately the difference between two independent, normal variables and hence approximately normally distributed. The prediction interval thus is of the form $\tilde{y}_{T+h} \pm t_{n-k,C}\, s(\tilde{y}_{T+h})$, where the forecast standard deviation $s(\tilde{y}_{T+h}) = \sqrt{E(y_{T+h} - \tilde{y}_{T+h})^2}$ contains a component for forecast error in the regressors.

To describe the increase in the prediction interval due to regressor forecast error, we can look at the relative forecast error variance. In Appendix A, we show that

$$\lim_{n\to\infty} \frac{s^2(\tilde{y}_{T+h})}{s^2(\hat{y}_{T+h})} = 1 + \sum_{j=1}^{k} \frac{p_j\, \rho^2_{yx_j\cdot x_{-j}}}{\big(1-\rho^2_{x_j x_{-j}}\big)\big(1-\rho^2_{yx_j\cdot x_{-j}}\big)} \qquad (1)$$

where $\rho^2_{yx_j\cdot x_{-j}}$ represents the population coefficient of partial determination for adding $x_j$ to a model already containing $x_{-j} = (1, x_1, \ldots, x_{j-1}, x_{j+1}, \ldots, x_k)'$, and $\rho^2_{x_j x_{-j}}$ represents the population coefficient of multiple determination for the regression of $x_j$ on $x_{-j}$. The square root of the right-hand side gives the ratio of the width of the ex ante to the ex post prediction interval.
We note in Appendix A that examining the ratio of prediction errors in the limit is equivalent to assuming that the regression coefficients are known.
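Once its three ingredients are in hand, equation (1) is a direct computation. The sketch below (function name and argument names are ours, not the paper's) evaluates the large-sample rfev for a set of regressors:

```python
import numpy as np

def rfev(p, partial_r2_y, r2_collin):
    """Large-sample approximation to the relative forecast error
    variance (rfev), equation (1): ratio of the ex ante to the ex post
    forecast error variance.  All arguments are sequences with one
    entry per regressor:
      p            -- p_j, regressor forecast error variance divided
                      by the regressor's variance
      partial_r2_y -- partial coefficient of determination for adding
                      x_j to a model containing the other regressors
      r2_collin    -- R^2 from regressing x_j on the other regressors
    """
    p = np.asarray(p, dtype=float)
    r_y = np.asarray(partial_r2_y, dtype=float)
    r_x = np.asarray(r2_collin, dtype=float)
    return 1.0 + float(np.sum(p * r_y / ((1.0 - r_x) * (1.0 - r_y))))
```

The square root of the returned value gives the relative width of the ex ante prediction interval; with every `p` equal to zero the function returns 1, recovering the Standard.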
TWO SPECIAL CASES OF INTEREST

Single explanatory variable

Consider the simplest case, in which there is a single regressor, $\tilde{x}_t = (1, x_{1t})'$, and $x_1$ is forecast with error. The right-hand side of equation (1) reduces to

$$1 + \frac{p_1\, r^2}{1 - r^2} \qquad (2)$$

where $r^2$ is the square of the correlation between y and x. When sample size is large, the rfev (the extent of variance inflation from the Standard that is due to error in forecasting x) is seen to depend on two factors:

(1) the strength of the relationship between y and x, as measured by $r^2$;
(2) the degree of error in forecasting x, measured by $p_1$.

If $r^2$ is close to zero, the rfev is close to unity. Hence, if the model supplies a very poor fit to the (in-sample) data, the question of accuracy in forecasting x out-of-sample is moot. Conversely, if $r^2$ is high, any degree, p, of error in forecasting x is considerably leveraged. Achieving forecast accuracy in a regressor becomes more important the better the model fits the in-sample data. Given the in-sample utility of the model, summarized by the leverage ratio $r^2/(1-r^2)$, the rfev grows in proportion to increases in p. If p = 1, the rfev, and hence the relative width of the prediction interval, is determined by the leverage ratio.

There is an interesting second-order effect. By taking the derivative of the square root of equation (2) with respect to p, we find that a given percentage point reduction in p, that is, a unit improvement in forecasting X, will generate a larger reduction in the relative width of the prediction interval when p is initially high than when p is initially low. This result lends support to Armstrong's above-mentioned assertion that there are diminishing returns to improved accuracy in forecasting a regressor. However, diminishing returns should be viewed in a relative sense: as shown above, for a model that fits well in-sample, improved accuracy in forecasting X out-of-sample can be worthwhile throughout the range of p.
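The single-regressor case, equation (2), makes the leverage effect easy to see; the following small sketch (names ours) computes the rfev and the implied relative interval width:

```python
import math

def single_regressor_rfev(p, r2):
    """Equation (2): large-sample rfev with a single regressor.
    p  -- regressor forecast error variance / regressor variance
    r2 -- squared correlation between y and x"""
    return 1.0 + p * r2 / (1.0 - r2)

def relative_interval_width(p, r2):
    """Ratio of the ex ante to the ex post prediction interval width:
    the square root of the rfev."""
    return math.sqrt(single_regressor_rfev(p, r2))
```

For example, with a good in-sample fit of $r^2 = 0.9$ the leverage ratio is 9, so even moderate regressor forecast error ($p = 0.5$) yields an rfev of 5.5: a prediction interval roughly 2.3 times the width reported by the Standard.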
Multiple regressors, only one forecast with error

We next consider the case where there are multiple explanatory variables but only $x_1$ is forecast with error. The extension of the single-regressor case is remarkably straightforward. The right-hand side of equation (1) becomes

$$1 + \frac{p_1\, \rho^2_{yx_1\cdot x_{-1}}}{\big(1-\rho^2_{x_1 x_{-1}}\big)\big(1-\rho^2_{yx_1\cdot x_{-1}}\big)} \qquad (3)$$

The expression following the plus sign can be termed the marginal effect of regressor $x_1$. From it we see that the effect of forecast error in $x_1$ on the precision of forecasting the dependent variable depends once again on $p_1$, the degree of forecast error in $x_1$, as well as on the coefficient of determination reflecting the strength of relationship between $x_1$ and y. In this case, the term $\rho^2_{yx_1\cdot x_{-1}}$ is a partial coefficient of determination, representing the utility of adding $x_1$ to a model which already contains the other regressors.
An additional term comes in as well, $\rho^2_{x_1 x_{-1}}$, which is the coefficient of determination in an equation in which $x_1$ is regressed on $x_2, x_3, \ldots, x_k$. This term therefore expresses the degree of collinearity between the given regressor and the others in the equation. The second and third factors are not independent; rather, as the degree of collinearity increases, the utility of adding $x_1$ to a model already containing $x_2, x_3, \ldots, x_k$ decreases. Collinearity may thus mitigate or exacerbate the effect of regressor forecast error on the rfev. At the other extreme, if $x_1$ and $x_2, x_3, \ldots, x_k$ are orthogonal, equation (3) reduces to equation (2). It is helpful to note that, in the general case described by equation (1), the rfev is an additive sum of the marginal effect terms, each of which represents the marginal contribution to the rfev of forecast error in an individual regressor.

ESTIMATING THE RELATIVE FORECAST ERROR VARIANCE

We will describe our methodology for estimating the rfev within the context of a model with two regressors:

$$y_t = \beta_1 x_t + \beta_2 z_t + \varepsilon_t \qquad (4)$$

The model contains no lagged variables and $\varepsilon_t$ is assumed to have zero mean and constant variance. The estimation relies on the form of the rfev expressed by Appendix A, equation (A1), whose components are the regression coefficients ($\beta_j$), the variance of the random error term, $\sigma^2_\varepsilon$, in the regression equation, and the variances of regressor forecast error, $\sigma^2_u$.

Historical series of length n are available. We withhold n − T observations from the series, so that the model is fit across periods 1, ..., T and used to forecast each test period T + h, where h = 1, ..., n − T. The presence of the regressors requires that an assumption be made about the test-period values of x and z. The several variations are shown in Table I along with the notation for the type of regression forecast obtained.
Table I. Test-period assumptions for the regressors (type of forecast for y)

  Both x and z are known       y(x, z)
  x is known, z is forecast    y(x, z!)
  x is forecast, z is known    y(x!, z)
  Both x and z are forecast    y(x!, z!)

Our procedure employs rolling out-of-sample evaluations. See Schnaars (1986) for an excellent empirical application and Tashman (2000) for a comprehensive evaluation of the procedure. Normally, rolling out-of-sample evaluations have been applied to compare the forecasting accuracy of extrapolation methods. The application to regression involves some additional considerations, as will now be described.

There are two phases. First, after the regression model is estimated over the initial fit period, the fit period is successively updated from origin T to origin n − 1. At each origin, the regression coefficients $\beta_j$ are recalibrated and a new mean square error obtained (an estimate of $\sigma^2_\varepsilon$). Then averages are taken of
the individual-origin estimates of the $\beta_j$ and of the individual-origin estimates of $\sigma^2_\varepsilon$. These averages are used as the inputs in Appendix A, equation (A1), for the $\beta_j$ and for $\sigma^2_\varepsilon$.

Second, forecasts must be generated for the regressors. Such forecasts can be judgemental, extrapolative, outputs of another model, or a mix of the three. For this study, only automatic extrapolations were applied. Doing so not only eliminates judgement as a potentially confounding factor, but provides a statistical basis for measurement of regressor forecast error. Extrapolative forecasts of the regressors were made at each origin T through n − 1 and input into a regression equation of a matching fit period. For example, regressor forecasts made at origin T + 2 were input into the regression equation that is fit through period T + 2. At each origin, h-step-ahead forecasts for x and z are subtracted from the known values of the regressors during period T + h to obtain h-step-ahead regressor forecast errors. Finally, for each regressor and each lead time, the mean square of these errors is calculated and input into Appendix A, equation (A1), as the estimate of $\sigma^2_u$.

In summary, we use Appendix A, equation (A1), to calculate a large-sample approximation of each h-step-ahead rfev. The inputs for the $\beta_j$ and $\sigma^2_\varepsilon$ terms are averages of coefficient estimates obtained at each origin T through n − 1. The inputs for $\sigma^2_u$, one for each regressor, are the mean square regressor forecast errors at h steps ahead.

CASE STUDY

This case study applies the preceding methodology to illustrate plausible values for the rfev. We have adapted and updated the Harvard Business School case called Alpha Concrete Products (Harvard College, 1974), a case that examines the use of regression analysis to forecast a company's annual sales revenue (Sales). Two primary regressors were the population of the sales region (Pop) and the number of Building Permits (Perms).
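The second phase above, collecting rolling h-step-ahead regressor forecast errors by lead time, can be sketched as follows. The function names, the 0-based indexing, and the naive extrapolator are our own illustration, not the paper's TsMetrix implementation:

```python
import numpy as np

def rolling_mse_by_lead(x, first_origin, forecast_fn, max_h):
    """Rolling out-of-sample evaluation for one regressor series x.
    From each origin (index of the last known observation, stepping
    from first_origin to the penultimate point), forecast_fn(history, h)
    returns an h-step-ahead extrapolation.  Returns, for each lead
    time h, the mean square forecast error -- the estimate of the
    regressor forecast error variance at that lead."""
    n = len(x)
    errors = {h: [] for h in range(1, max_h + 1)}
    for origin in range(first_origin, n - 1):
        for h in range(1, max_h + 1):
            target = origin + h
            if target < n:
                errors[h].append(x[target] - forecast_fn(x[:origin + 1], h))
    return {h: float(np.mean(np.square(e))) for h, e in errors.items() if e}

# A random-walk (naive) extrapolator, the method the automatic
# algorithm in the case study defaulted to for Perms:
naive = lambda history, h: history[-1]
```

With a trending series and the naive method, the mean square error grows with the lead time, which is exactly the pattern the paper reports for the regressor forecast error variance over the horizon.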
The annual series are shown in Appendix B. Overall model fit was good, with a multiple R² above 0.95 and residuals showing no evidence of model misspecification. The partial coefficients of determination (for the initial fit period) were 0.91 for Pop and 0.67 for Perms.

Forecasts for the regressors were made using an automatic exponential smoothing algorithm. For Pop, a linear trend (Holt's method) was chosen. Out-of-sample forecast errors were small, with the parameter p, measuring the degree of regressor forecast error, being less than 0.1. Perms, in contrast, was a highly cyclical variable. The automatic algorithm defaulted to a random walk, and out-of-sample forecast error, with p above 0.75, was substantial.

To summarize the characteristics of the regressors: Pop was an extremely important regressor in-sample and could be forecast accurately out of sample. Perms was a statistically significant but less important regressor whose out-of-sample forecasting accuracy was very poor. There was a low degree of collinearity between the two regressors. Based on the analysis of the previous section, we know that the effects on the rfev of regressor utility and regressor forecastability are offsetting, so that neither regressor in this case study presents an extreme case.

We selected the commercial software package TsMetrix to fit the regression equations because of its unique capability (Tashman, 2000) to recalibrate regression coefficients in a rolling out-of-sample evaluation.
In Table II, we report estimates of the relative forecast standard error (the square root of the rfev) for forecast horizons of 1–5 years. The initial fit period was set by withholding the last 7 years of data. Withholding somewhat more data than the longest forecast horizon ensures that the empirical estimates of the error variances at the longest horizon are based on more than a single forecast error.

Table II. Large-sample approximations of the square root of the rfev in the Alpha Concrete Products case (base of 100 is the Standard). Rows: Sales(Pop, Perms!), Sales(Pop!, Perms), Sales(Pop!, Perms!); columns: forecast horizons 1–5. [The numeric entries are garbled in this transcription.]

Each value in the table represents the ratio of the standard error of an ex ante forecast (square root of the forecast error variance) to the standard error of the ex post forecast, Sales(Pop, Perms). The first line of values, Sales(Pop, Perms!), shows the degree of inflation in the standard error of the forecast attributable to the marginal effect of forecast error in Perms. (In this line, known values of Pop and forecasts of Perms were used to forecast Sales.) The ex ante standard error is 38% higher for one-year-ahead forecasts and more than double the Standard at horizons 2–5. From the second line, Sales(Pop!, Perms), we see that, while forecast error in Pop inflates the standard error by only 5% at the first horizon, the inflation grows dramatically as the forecast horizon lengthens. This pattern is due to the growth in the out-of-sample estimates of the regressor forecast error variance, $\sigma^2_u$, as the forecast horizon lengthens. In the general ex ante forecast, Sales(Pop!, Perms!), the reported ratios are uniformly highest, an expected result reflecting the additive marginal effects of individual-regressor forecast error. For a three-year-ahead forecast of Sales, the results show that the ex ante standard error of the forecast is nearly three times that of the Standard.
As a first approximation, that is, assuming that the distributional critical values applicable to the distributions of ex post forecast errors were applicable to the ex ante errors as well, we could say that the prediction interval for the 3-year-ahead forecast should be nearly three times as wide as the prediction interval the forecaster will be shown by forecasting software.

If these large-sample approximations are adequate, the adjustments required to the Standard to account for regressor forecast error are reasonably straightforward. An estimate of the degree of regressor forecast error (p) must be made for each lead time. Although this can be done judgementally, an automatic extrapolation would provide an efficient macro for the regression routines. The remaining two components (Appendix A, equation (A1)) are byproducts of any regression algorithm. Software developers should be encouraged to make the relatively minor adaptations required to facilitate these calculations.

TESTING THE LARGE-SAMPLE APPROXIMATION

In this section, we compare the results in Table II against empirical measures of the variance of out-of-sample forecast errors.
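The proposed adjustment, inflating the Standard's prediction-interval half-width by the square root of the large-sample rfev of equation (1), could be sketched as below. The function and its signature are our own illustration of the calculation software developers are encouraged to support:

```python
import math

def adjusted_interval_halfwidth(standard_se, t_crit, p, partial_r2_y, r2_collin):
    """Widen the Standard prediction-interval half-width by the square
    root of the large-sample rfev (equation (1)).
      standard_se  -- forecast standard error from the Standard
      t_crit       -- t critical value for the chosen confidence level
      p, partial_r2_y, r2_collin -- per-regressor lists as in eq. (1):
        degree of regressor forecast error, partial R^2 for the
        regressor, and R^2 of the regressor on the others."""
    rfev = 1.0 + sum(
        pj * ry / ((1.0 - rx) * (1.0 - ry))
        for pj, ry, rx in zip(p, partial_r2_y, r2_collin)
    )
    return t_crit * standard_se * math.sqrt(rfev)
```

With all `p` values at zero the half-width reduces to the conventional `t_crit * standard_se`, so the adjustment only ever widens the interval.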
Empirical out-of-sample forecast errors were a byproduct of the rolling out-of-sample evaluations of the prior sections. Based on each combination of known and forecasted values of the regressors, as shown in Table I, we generated forecasts of the dependent variable from each origin, T (year 15) to n − 1 (year 21). When collated by lead time, the result is a collection of seven one-step-ahead forecasts, six two-step-ahead forecasts, and so forth through three five-step-ahead forecasts. We used the actual values of the dependent variable through the test period to calculate the forecast errors and then, for each group of specific lead-time errors, we computed the variance as the mean of the squared errors.

Shown in Table III are the results from the complete ex ante forecasting equation: Sales(Pop!, Perms!). Results for the partial ex ante forecasts are very similar. The weights used in the weighted average represent the number of forecasts recorded at each lead time: from seven one-year-ahead forecasts down to three five-year-ahead forecasts.

Table III. Large-sample approximations versus purely empirical estimates of the square root of the rfev in the Alpha Concrete Products case (both regressors forecast ex ante). Rows: forecast horizons 1–5 and weighted average; columns: large-sample approximation, purely empirical calculation. [The numeric entries are garbled in this transcription.]

The purely empirical calculations and our large-sample approximations appear to be a close match. The two types of estimates of the rfev are consistent in demonstrating (a) inflation of the forecast error variance in the face of regressor forecast error and (b) the tendency of the inflation factors to increase with the lead time of the forecast. (There is a reduction in the purely empirical rfev at lead 5; however, empirical measures can be erratic.) The proximity of the results in Table III can, of course, be at least partly a chance occurrence, and evidence based on a single case study can be considered no more than indicative.
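The per-lead-time variances and their weighted average, with weights equal to the number of forecasts recorded at each lead, can be computed as in this sketch (names ours):

```python
import numpy as np

def leadtime_variances(errors_by_lead):
    """errors_by_lead maps each lead time h to the list of forecast
    errors collected at that lead.  Returns (a) the per-lead variance,
    computed as the mean of the squared errors, and (b) a weighted
    average of those variances, each lead weighted by its number of
    forecasts, as in Table III."""
    variances = {h: float(np.mean(np.square(e))) for h, e in errors_by_lead.items()}
    counts = {h: len(e) for h, e in errors_by_lead.items()}
    total = sum(counts.values())
    weighted_avg = sum(variances[h] * counts[h] for h in variances) / total
    return variances, weighted_avg
```

Weighting by the forecast count gives the leads with more observations (short horizons) proportionally more influence on the summary figure.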
Nonetheless, the findings are encouraging and, in our judgement, suggest that the proposed adjustments to the Standard to account for regressor forecast error are worthy of further investigation.

SUMMARY

In this paper, we have reported large-sample approximations for the relative forecast error variance of a single-equation regression model. The assumptions made are that the regressor forecast errors are uncorrelated among themselves and with the model errors. The results show that, under these assumptions, the rfev depends upon three parameters:

- the incremental utility of adding a regressor to a model, $\rho^2_{yx_j\cdot x_{-j}}$
- the degree of error in forecasting the regressor out-of-sample, $p_j$
- the degree of multicollinearity between the regressor in question and the set of remaining regressors in the model, $\rho^2_{x_j x_{-j}}$
These factors interact; for example, a given degree of error in forecasting a regressor has a more powerful effect on the forecast error variance the greater the utility of adding that regressor to the model.

We have shown how estimates can be made of the rfev using rolling out-of-sample evaluations with matching fit periods for the regressors and the dependent variable. When applied to a case study using 'typical' data, the results suggested that the square root of the rfev (a) grows with the lead time of the forecast, reflecting increases in the variance of regressor forecast error over the forecast horizon, and (b) can readily exceed a factor of 2, which is to say that regressor forecast error can more than double the width of a prediction interval. Finally, we compared our large-sample approximations of the rfev to purely empirical calculations of the out-of-sample forecast error variance and found the two sets of estimates to be very close. Our conclusion is that the large-sample approximations show promise as a valid basis for calculation of ex ante prediction intervals and are worthy of further empirical evaluations.

APPENDIX A: PROOF OF EQUATION (1)

Here we establish the identity given in equation (1). We first show that

$$\lim_{n\to\infty} \frac{s^2(\tilde{y})}{s^2(\hat{y})} = 1 + \sum_{j=1}^{k} \frac{\beta_j^2\, \sigma^2_{u_j}}{\sigma^2_\varepsilon} \qquad (A1)$$

and we then show that

$$\frac{\beta_j^2\, \sigma^2_{u_j}}{\sigma^2_\varepsilon} = \frac{p_j\, \rho^2_{yx_j\cdot x_{-j}}}{\big(1-\rho^2_{x_j x_{-j}}\big)\big(1-\rho^2_{yx_j\cdot x_{-j}}\big)} \qquad (A2)$$

from which the result follows. Examining the ratio of prediction errors in the limit is equivalent to assuming that $\beta$ is known. Then

$$s^2(\hat{y}_{T+h}) = E(y_{T+h} - \hat{y}_{T+h})^2 = E(y_{T+h} - x'_{T+h}\beta)^2 = \sigma^2_\varepsilon$$

and

$$\begin{aligned} s^2(\tilde{y}_{T+h}) &= E(y_{T+h} - \tilde{y}_{T+h})^2 = E(y_{T+h} - \hat{x}'_{T+h}\beta)^2 \\ &= E(y_{T+h} - x'_{T+h}\beta - u'_{T+h}\beta)^2 \\ &= E(y_{T+h} - x'_{T+h}\beta)^2 - 2E\big[u'_{T+h}\beta\,(y_{T+h} - x'_{T+h}\beta)\big] + E(u'_{T+h}\beta)^2 \\ &= \sigma^2_\varepsilon + \beta'\Sigma_u\beta = \sigma^2_\varepsilon + \sum_{j=1}^{k}\beta_j^2\,\sigma^2_{u_j} \end{aligned}$$

In the middle line of the expression for $s^2(\tilde{y}_{T+h})$, the cross-product term is zero under the assumption that regressor forecast error, $u_{T+h}$, and model error, $\varepsilon_{T+h}$, are uncorrelated. In the final line, the $\beta'\Sigma_u\beta$ term becomes the summation $\sum_{j=1}^{k}\beta_j^2\,\sigma^2_{u_j}$ under the assumption that regressor forecast errors are uncorrelated among themselves. Taking the ratio of the final expression for $s^2(\tilde{y}_{T+h})$ to $s^2(\hat{y}_{T+h})$ establishes equation (A1).

To establish equation (A2), let $SSE(x_{-j})$ represent the sum of squared errors for the regression of $y_t$ on $x_{-j}$. A straightforward application of the expectation of a quadratic form shows that

$$E[SSE(x_{-j})] = \sigma^2_\varepsilon\,(n-k+1) + \beta_j^2\, E[SSE(x_j \text{ on } x_{-j})]$$

where $SSE(x_j \text{ on } x_{-j})$ represents the sum of squared errors for the regression of $x_j$ on $x_{-j}$. It is not difficult to show that $E[SSE(x_j \text{ on } x_{-j})] = (n-k+1)\big(1-\rho^2_{x_j x_{-j}}\big)\sigma^2_{x_j}$. The population coefficient of partial determination for adding $x_j$ to a model already containing $x_{-j}$ is, by definition,

$$\rho^2_{yx_j\cdot x_{-j}} = \lim_{n\to\infty} \frac{SSE(x_{-j}) - SSE(x_j, x_{-j})}{SSE(x_{-j})}$$

where $SSE(x_j, x_{-j})$ represents the sum of squared errors from the regression of $y_t$ on $(x_j, x_{-j})$. Then it follows that

$$\rho^2_{yx_j\cdot x_{-j}} = \frac{\beta_j^2\,\big(1-\rho^2_{x_j x_{-j}}\big)\sigma^2_{x_j}}{\sigma^2_\varepsilon + \beta_j^2\,\big(1-\rho^2_{x_j x_{-j}}\big)\sigma^2_{x_j}}$$

Straightforward algebra leads to equation (A2), and equation (1) follows immediately from this.

APPENDIX B: THE ALPHA CONCRETE PRODUCTS DATA

Annual series of Sales (dollars), Pop (number of people) and Perms (number of permits), by year. [The numeric entries are garbled in this transcription.]
SOFTWARE LIST

(1) Forecast Pro for Windows, Version 3 (1997), Business Forecast Systems, Belmont, CA.
(2) Minitab, Release 12 (1998), Minitab Inc., State College, PA.
(3) SAS/ETS, Release 6 (1997), SAS Institute, Inc., Cary, NC.
(4) SmartForecasts for Windows, Version 1 (1997), SmartSoftware Inc., Belmont, CA.
(5) SPSS Trends, Version 7.5 (1997), SPSS, Inc., Chicago, IL.
(6) TsMetrix, Version 2 (1997), RER, Inc., San Diego, CA.

The paper refers only to the regression options in these packages. In SAS/ETS and other ARIMA packages, ARIMA procedures can be used by the sophisticated analyst to obtain forecast standard errors that incorporate regressor forecast error. To do so, one can (a) create a univariate forecast for each regressor and (b) use a transfer function to reproduce a regression model containing those regressors. However, this approach cannot be viewed as a satisfactory surrogate for many practitioners of regression-based forecasting. For one, it is infeasible if the time series are too short for ARIMA modeling. It does not allow for judgemental inputs of regressor forecast error variance. Also, the software capability is not widely available: of the six packages listed above, for example, only SAS/ETS offers the requisite ARIMA technology.

ACKNOWLEDGEMENTS

Many thanks go to former University of Vermont students Michael Brodie and Peter Tashman for very significant contributions to early stages of this research, and to Professor William Bassin of Shippensburg University for his constant support and feedback over the long life of this project.

REFERENCES

Adams FG. The Business Forecasting Revolution. Oxford University Press: New York.
Armstrong JS. Long-Range Forecasting, 2nd edn. Wiley-Interscience: New York.
Ashley R. On the usefulness of macroeconomic forecasts as inputs to forecasting models. Journal of Forecasting 2: 211-223.
Ashley R. On the relative worth of recent economic forecasts. International Journal of Forecasting 4: 363-376.
Bassin WM. How to anticipate the accuracy of a regression model. Journal of Business Forecasting: Methods & Systems 6: 26-28.
Diebold F. Elements of Forecasting. South-Western: Cincinnati.
Feldstein M. The error of forecast in econometric models when the forecast-period exogenous variables are stochastic. Econometrica 39: 55-60.
Fildes R. Quantitative forecasting, the state of the art: econometric models. Journal of the Operational Research Society 36: 549-580.
Geriner PT, Ord JK. Automatic forecasting using explanatory variables: a comparative study. International Journal of Forecasting 7: 127-140.
Intrilligator MD. Econometric Methods, Techniques and Applications. Prentice Hall: Englewood Cliffs, NJ.
Jacobs J, Sterken E. Macroeconomic models and portfolio investment. In International Symposium on Forecasting, Stockholm.
Jarrett J. Improving forecasts by decomposing the error. Journal of Business Forecasting: Methods & Systems 9: 12-15.
Levenbach H, Cleary JP. The Modern Forecaster. Lifetime Learning Publications: Belmont, CA.
Newbold P, Bos T. Introductory Business and Economic Forecasting, 2nd edn. South-Western: Cincinnati.
Osborn DR, Teal F. An assessment and comparison of two NIESR econometric model forecasts. National Institute Economic Review 27: 50-62.
Pindyck RS, Rubinfeld DL. Econometric Models and Economic Forecasts, 3rd edn. McGraw-Hill: New York.
Schnaars SP. A comparison of extrapolation procedures on yearly sales forecasts. International Journal of Forecasting 2: 71-85.
Tashman LJ. Out-of-sample tests of forecast accuracy: an analysis and review. International Journal of Forecasting 16: forthcoming.

Authors' biographies:

Leonard J. Tashman has spent half his life on the faculty of the School of Business Administration of the University of Vermont. Forecasts of his near-term retirement are probably accurate.
Thorodd Bakken received his B.S. and MBA degrees from the School of Business Administration of the University of Vermont. He works for Citigroup as a foreign exchange and interest rate dealer. He was a national-team cross-country skier in Norway and a four-time NCAA champion in the USA.

Jeffrey Buzas is Associate Professor of Statistics in the Department of Mathematics and Statistics at the University of Vermont. His research interests include covariate measurement error in non-linear regression models.

Authors' addresses:

Leonard J. Tashman, School of Business Administration, University of Vermont, Burlington, VT 05405, USA.
Thorodd Bakken, Citibank International plc, Norway Branch, Tordenskiolds Gate 8-10, P.O. Box 1481 Vika, 0116 Oslo, Norway.
Jeffrey Buzas, Department of Mathematics and Statistics, University of Vermont, Burlington, VT 05405, USA.
More information