Problems That May Occur in Time Series Multiple Regression

 

1.  Multicollinearity.  If one independent variable is highly linearly correlated with another independent variable, it will be impossible to determine their separate influences.  The problem lies in the data, not in the regression model itself, and is signaled by three symptoms (a detection sketch follows the list):

            (1) a high R^2 with low values for the t statistics,

            (2)  high values for simple correlation coefficients between the independent variables, and

            (3)  regression coefficients that are sensitive to model specification when both variables are included.  A variable may even take the wrong sign.
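
For illustration, the sketch below uses Python (numpy and statsmodels, both assumed available) on made-up data in which x2 is nearly a copy of x1.  It checks the first two symptoms directly and also reports variance inflation factors, a standard collinearity diagnostic.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Made-up data: x2 is almost a linear copy of x1, so the two regressors are collinear.
    rng = np.random.default_rng(0)
    n = 100
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)      # nearly collinear with x1
    y = 2.0 + 1.5 * x1 + 0.5 * x2 + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()

    print(fit.rsquared)                # symptom (1): high R^2 ...
    print(fit.tvalues[1:])             # ... with individually weak t statistics
    print(np.corrcoef(x1, x2)[0, 1])   # symptom (2): high simple correlation between regressors
    # Symptom (3) shows up as unstable coefficients when the model is re-specified;
    # variance inflation factors well above 10 flag the same problem.
    for j in range(1, X.shape[1]):
        print(variance_inflation_factor(X, j))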

 

Multicollinearity may not be a problem in forecasting, but it will be a problem in simulation, where the individual coefficient estimates are used directly.

 

Solutions to the problem are to (a) combine the collinear variables, (b) eliminate one of the variables, or, possibly, (c) increase the sample size.

 

Stepwise regression will also mitigate the problem by selecting the “best” predictor variable to enter at each step, given the other independent variables already in the model.
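
As a rough illustration of the idea (not a full stepwise package), the Python sketch below performs forward selection, at each step adding the candidate variable that most improves adjusted R^2.  The stopping rule and the variable names are assumptions of the sketch; textbook stepwise procedures usually rely on partial F or t tests instead.

    import numpy as np
    import statsmodels.api as sm

    def forward_stepwise(y, candidates):
        """Greedy forward selection: add, one at a time, the candidate variable
        that yields the largest improvement in adjusted R^2; stop when no
        remaining candidate improves it.  `candidates` maps names to 1-D arrays."""
        selected, remaining = [], dict(candidates)
        best_adj_r2 = -np.inf
        while remaining:
            scores = {}
            for name, x in remaining.items():
                cols = [candidates[s] for s in selected] + [x]
                X = sm.add_constant(np.column_stack(cols))
                scores[name] = sm.OLS(y, X).fit().rsquared_adj
            best_name = max(scores, key=scores.get)
            if scores[best_name] <= best_adj_r2:
                break
            best_adj_r2 = scores[best_name]
            selected.append(best_name)
            del remaining[best_name]
        return selected

    # Hypothetical usage with three candidate series, one of them irrelevant:
    rng = np.random.default_rng(3)
    n = 80
    x1, x2, x3 = rng.normal(size=(3, n))
    y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)
    print(forward_stepwise(y, {"x1": x1, "x2": x2, "x3": x3}))   # typically ['x1', 'x2']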

 

2.  Heteroscedasticity.  The error terms are assumed to have a constant variance; if the variance instead depends upon the value of X, then heteroscedasticity exists.

           

The Goldfeld-Quandt test may be used to detect heteroscedasticity.
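
A minimal sketch of the test, assuming Python's statsmodels is available, on made-up data whose error variance grows with X.  The observations are sorted on X so that the split compares residual variances from the low-X and high-X halves.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_goldfeldquandt

    # Made-up data in which the error variance increases with x.
    rng = np.random.default_rng(1)
    n = 120
    x = np.sort(rng.uniform(1, 10, size=n))
    y = 3.0 + 2.0 * x + rng.normal(scale=0.5 * x, size=n)   # error spread rises with x

    X = sm.add_constant(x)
    # Data are already ordered by x, so the default split uses that ordering.
    fstat, pvalue, _ = het_goldfeldquandt(y, X)
    print(fstat, pvalue)   # a small p-value indicates heteroscedasticity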

 

The correction for heteroscedasticity requires the use of (a) weighted least squares, (b) transformed variables, or (c) respecification of the model.
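
As an example of correction (a), the sketch below refits the previous made-up example by weighted least squares.  The assumed variance structure (error standard deviation proportional to X, hence weights of 1/X^2) is an assumption of the sketch, not something the data provide automatically.

    import numpy as np
    import statsmodels.api as sm

    # Same made-up data as above: error standard deviation roughly proportional to x.
    rng = np.random.default_rng(1)
    n = 120
    x = np.sort(rng.uniform(1, 10, size=n))
    y = 3.0 + 2.0 * x + rng.normal(scale=0.5 * x, size=n)

    X = sm.add_constant(x)
    ols_fit = sm.OLS(y, X).fit()
    wls_fit = sm.WLS(y, X, weights=1.0 / x**2).fit()   # weights = 1 / assumed error variance

    print(ols_fit.bse)   # OLS standard errors (unreliable under heteroscedasticity)
    print(wls_fit.bse)   # WLS standard errors under the assumed variance structure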

 

3.  Autocorrelation.  Error terms are assumed to be independent over time; when an error term is correlated with its own past values, autocorrelation exists.  Autocorrelated error terms are the most important consideration for a practicing forecaster.

 

The Durbin-Watson statistic can be used to test for the autocorrelation problem (specifically, first-order autocorrelation).
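
A minimal sketch, assuming Python's statsmodels, on made-up data with AR(1) errors (rho = 0.7).  The statistic is roughly 2(1 - rho_hat), so values near 2 suggest no first-order autocorrelation and values well below 2 suggest positive autocorrelation.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson

    # Made-up trend series whose errors follow an AR(1) process with rho = 0.7.
    rng = np.random.default_rng(2)
    n = 150
    t = np.arange(n)
    e = np.zeros(n)
    for i in range(1, n):
        e[i] = 0.7 * e[i - 1] + rng.normal()
    y = 10.0 + 0.5 * t + e

    X = sm.add_constant(t)
    resid = sm.OLS(y, X).fit().resid
    print(durbin_watson(resid))   # well below 2 here, signaling positive autocorrelation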

 

Solutions include (a) finding an important omitted variable, (b) transforming the variables based upon generalized least squares, and (c) introducing a lagged value of the dependent variable on the right-hand side of the equation.
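
The sketch below illustrates solutions (b) and (c) on the same made-up AR(1) example.  It uses statsmodels' GLSAR, which iterates between estimating the error autocorrelation and refitting (a Cochrane-Orcutt style procedure), and then an ordinary regression with the lagged dependent variable added as a regressor; both the data and the choice of procedure are assumptions of the sketch.

    import numpy as np
    import statsmodels.api as sm

    # Same made-up trend series with AR(1) errors as in the Durbin-Watson sketch.
    rng = np.random.default_rng(2)
    n = 150
    t = np.arange(n)
    e = np.zeros(n)
    for i in range(1, n):
        e[i] = 0.7 * e[i - 1] + rng.normal()
    y = 10.0 + 0.5 * t + e
    X = sm.add_constant(t)

    # (b) GLS-style transformation: estimate an AR(1) error process and refit iteratively.
    glsar_fit = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10)
    print(glsar_fit.model.rho)     # estimated autocorrelation of the errors
    print(glsar_fit.params)

    # (c) Alternatively, include the lagged dependent variable as a regressor.
    X_lag = sm.add_constant(np.column_stack([t[1:], y[:-1]]))
    lag_fit = sm.OLS(y[1:], X_lag).fit()
    print(lag_fit.params)          # constant, trend, and coefficient on y(t-1)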