robjhyndman / M4metalearning


Target variable for finding weights #7

Closed · eddiepyang closed this issue 4 years ago

eddiepyang commented 6 years ago

I am not sure I understood the methodology for using xgboost to find the weights for each forecast method. Is it correct that the independent variables are the generated features and the targets are the OWA errors from the forecasts?

pmontman commented 5 years ago

I would say the target variable is 'implicit': ideally it would be the vector of weights that minimizes the OWA errors. The OWA errors enter through the custom loss function, playing a role similar to a target variable, but strictly speaking you do not want xgboost to predict the OWA errors; you want it to produce weights that minimize them. You can think of the OWA errors as reweighting the per-class and per-instance errors when minimizing the loss of a traditional multiclass classification problem, where the target class would be the method in the pool that produces the smallest OWA.
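To make the idea concrete, here is a minimal numpy sketch of what such a custom objective could look like, not the package's actual implementation. The names and shapes (`owa_errors`, `n_series`, `n_methods`) and the crude positivity floor on the second derivative are my own assumptions for illustration: raw per-method scores are turned into combination weights with a softmax, the per-series loss is the weighted average OWA, and the gradient/Hessian are what a custom xgboost objective would have to return.

```python
import numpy as np

# Hypothetical setup: n_series time series, n_methods candidate forecasting methods.
# owa_errors[i, k] is the OWA error that method k achieves on series i (assumed given).
rng = np.random.default_rng(0)
n_series, n_methods = 100, 9
owa_errors = rng.uniform(0.5, 2.0, size=(n_series, n_methods))

def softmax(scores):
    """Row-wise softmax: raw scores -> combination weights per series."""
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def weighted_owa_objective(raw_scores, owa):
    """Sketch of the custom objective: loss_i = sum_k w_ik * owa_ik.

    Returns the gradient and a (clipped) diagonal second derivative with respect
    to the raw scores, the two arrays an xgboost custom objective must supply.
    """
    w = softmax(raw_scores)                          # weights, shape (n_series, n_methods)
    expected = (w * owa).sum(axis=1, keepdims=True)  # weighted OWA per series
    grad = w * (owa - expected)                      # d(loss)/d(score)
    hess = grad * (1.0 - 2.0 * w)                    # exact diagonal second derivative
    hess = np.maximum(hess, 1e-6)                    # crude positivity floor (my assumption)
    return grad, hess, expected.mean()

# Sanity check: with uniform scores the loss is just the plain average OWA.
scores = np.zeros((n_series, n_methods))
grad, hess, loss = weighted_owa_objective(scores, owa_errors)
print(loss, owa_errors.mean())   # both equal the unweighted mean OWA
```

In the full setup, the generated time-series features would be the model's input matrix, the flattened `grad` and `hess` would be returned from the objective passed to xgboost's training routine, and the OWA errors would only ever enter through this loss, which is why there is no explicit target column.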