Closed: kkckk1110 closed this issue 7 months ago
Hey @kkckk1110, thanks for using mlforecast. We use the recursive strategy by default, which means that the model's predictions are used as inputs to recompute the features at every forecasting step. By using only lag 1, you're basically providing the same feature at every step. Since you're forecasting 12 steps ahead, you could also try adding lag 12. Other features, such as lag transformations, would help as well.
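The feedback loop described above can be sketched in plain Python (the model and data here are hypothetical stand-ins, not mlforecast internals). A tree-based regressor is piecewise-constant, so once the lag-1 feature settles into one region, every subsequent step receives the same feature and produces the same prediction:

```python
def recursive_forecast(history, predict_one, h, lags):
    """Forecast h steps ahead, recomputing lag features from history + predictions."""
    series = list(history)
    preds = []
    for _ in range(h):
        # build the lag features from the end of the (growing) series
        features = {f"lag{lag}": series[-lag] for lag in lags}
        y_hat = predict_one(features)
        preds.append(y_hat)
        series.append(y_hat)  # the prediction feeds back as a future input
    return preds

# A stand-in for a fitted tree model: piecewise-constant in its feature.
flat_tree = lambda f: 10.0 if f["lag1"] >= 5 else 2.0

history = [1, 3, 6, 8, 7]
print(recursive_forecast(history, flat_tree, h=12, lags=[1]))
# every step sees lag1 in the same region, so all 12 forecasts are 10.0
```

From step 2 onward, the only feature the model sees is its own previous output, which explains the nearly flat forecast in the plot.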
Thanks for your attention! That is to say, at each forecasting step, the model receives the value predicted in the last step as a feature?
> the model receives the value predicted in the last step as a feature?
If you use lag 1, yes. If you use a higher lag, such as 12 in this example, you would get the real values from your series in the previous year, which will most likely help your model.
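This can be checked with a small index calculation (a sketch, not mlforecast code): if the history has n observed points, forecast step s (1-based) with lag l reads position n + s - 1 - l of the combined history-plus-predictions sequence, which is an observed value exactly when s <= l. So over a 12-step horizon, lag 12 is backed by real data at every step, while lag 1 is backed by real data only at the first step:

```python
def lag_is_observed(step, lag):
    """True if the lag feature at this 1-based forecast step comes from real data.

    The feature points at index n + step - 1 - lag of the combined sequence,
    which lies inside the observed history (indices 0..n-1) iff step <= lag.
    """
    return step <= lag

h = 12
lag1_real = sum(lag_is_observed(s, 1) for s in range(1, h + 1))
lag12_real = sum(lag_is_observed(s, 12) for s in range(1, h + 1))
print(lag1_real, lag12_real)
# prints: 1 12
```

With monthly data this means the lag-12 feature for each forecasted month is the actual value from the same month one year earlier.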
Thanks a lot! I have fixed the problem.
What happened + What you expected to happen
I am training an XGBoost model with lag 1 and predicting the following 12 months. However, the values forecasted by the model were nearly identical across time points, which was quite weird. The following is a sample of the results: the yellow line is the true value, while the blue line represents the XGBoost forecasts.
Versions / Dependencies
I am using mlforecast==0.11.5.
Reproduction script
```python
from mlforecast import MLForecast
from xgboost import XGBRegressor

# `params` and `train` are defined elsewhere;
# the columns of train are: unique_id, ds, sales
models = [
    XGBRegressor(
        random_state=42,
        n_estimators=500,
        learning_rate=params['learning_rate'],
        max_depth=params['max_depth'],
        min_child_weight=params['min_child_weight'],
        subsample=params['subsample'],
        colsample_bytree=params['colsample_bytree'],
    )
]
model = MLForecast(models=models, lags=[1], freq='MS')
model.fit(train, id_col='unique_id', time_col='ds', target_col='sales',
          static_features=[], fitted=True)
p = model.predict(h=12)  # forecasting the next 12 months
```
Issue Severity
None