Hi there,
I am having trouble understanding how the `rescale` parameter affects the model results. From what I understand, the time series is scaled by a factor of, say, 100 to help the optimizer find a solution. As a result, the model parameters, the conditional volatility and the residuals are also scaled by this factor.
So my first question is whether the arch package could be extended with a function to undo the rescaling of the results, i.e. to convert all model results to the scale they would have if I estimated with `rescale=False`. I used to do this manually after estimation. However, I recently noticed an issue with model forecasts when an AR term is added to the mean equation.
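For context, this is roughly what my manual conversion looks like. It is only a sketch: the helper name `unscale_params` and the parameter layout `[mu, ar1, omega, alpha, beta]` are my own assumptions for an AR(1)-GARCH(1,1) fit, not part of the arch API. The idea is that mean parameters scale linearly with the factor, the variance intercept scales quadratically, and the AR/ARCH/GARCH coefficients are scale-invariant:

```python
import numpy as np

def unscale_params(params, scale):
    """Convert parameters estimated on scale * y back to the scale of y.

    Hypothetical helper; assumes an AR(1)-GARCH(1,1) parameter layout
    [mu, ar1, omega, alpha, beta].
    """
    mu, ar1, omega, alpha, beta = params
    return np.array([
        mu / scale,        # constant in the mean scales linearly with the data
        ar1,               # AR coefficient is scale-invariant
        omega / scale**2,  # variance intercept scales quadratically
        alpha,             # ARCH coefficient is scale-invariant
        beta,              # GARCH coefficient is scale-invariant
    ])

# Example: parameters estimated on 2 * y mapped back to the scale of y
print(unscale_params([1.0, 0.5, 4.0, 0.1, 0.8], 2.0))
```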
Here is a toy example to replicate the issue:
```python
import pandas as pd
from arch import arch_model

y = (
    pd.read_csv(
        "https://cdn.cboe.com/api/global/us_indices/daily_prices/VIX_History.csv",
        parse_dates=["DATE"],
        index_col="DATE",
        usecols=["DATE", "CLOSE"],
    )
    .assign(CLOSE=lambda x: x["CLOSE"] / 100)
    .values.ravel()
)

model = arch_model(
    y, p=1, o=0, q=1, mean="AR", lags=1, vol="GARCH", dist="normal", rescale=False
).fit()
model_rescale = arch_model(
    y, p=1, o=0, q=1, mean="AR", lags=1, vol="GARCH", dist="normal", rescale=True
).fit()

print(model.forecast(horizon=3, reindex=False).variance)
print(model_rescale.forecast(horizon=3, reindex=False).variance)
```
These are the results I get:
|                 | h.1      | h.2         | h.3          |
|-----------------|----------|-------------|--------------|
| `model`         | 0.000077 | 0.000148    | 0.000214     |
| `model_rescale` | 0.892588 | 8613.118014 | 8.310370e+07 |
(`model_rescale.scale` is 100.0.)
Since this is a variance forecast, I assume that the h.1 forecast of `model_rescale` must be divided by `100.0 ** 2` (i.e. `scale ** 2`) to arrive at the same scaling as the unscaled model. However, I wonder why this rescaling factor keeps being dragged along into the h.2 and h.3 forecasts, which causes the values to explode after a few iterations. The problem does not occur when the mean is set to zero, so my conclusion is that the scale factor probably should not enter the AR recursion. The problem would then also not occur if the model parameters were "re-rescaled" before the forecasts are computed. Am I missing something here, or is this a bug?
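To illustrate what I would expect, here is a pure-numpy sketch of the GARCH(1,1) multi-step variance recursion (the parameter values are made up, not taken from the fits above). If omega and the first-step variance both carry a factor of `scale ** 2`, every horizon should differ from the unscaled forecast by exactly that one factor, rather than picking up an additional factor per step:

```python
import numpy as np

def garch_forecast(omega, alpha, beta, h1, horizon):
    """Iterate the GARCH(1,1) multi-step variance recursion
    E[sigma^2_{t+k}] = omega + (alpha + beta) * E[sigma^2_{t+k-1}]."""
    out = [h1]
    for _ in range(horizon - 1):
        out.append(omega + (alpha + beta) * out[-1])
    return np.array(out)

scale = 100.0
omega, alpha, beta, h1 = 1e-6, 0.1, 0.85, 7.7e-5  # made-up values

# Forecast on the original scale ...
f = garch_forecast(omega, alpha, beta, h1, 3)
# ... and on the rescaled series: omega and h1 each pick up scale**2
f_scaled = garch_forecast(omega * scale**2, alpha, beta, h1 * scale**2, 3)

# Every horizon differs by exactly scale**2, so dividing once recovers f
assert np.allclose(f_scaled / scale**2, f)
```

This is the behavior I would expect from the rescaled fit; instead, the h.2 and h.3 forecasts above grow by several orders of magnitude per step.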