rstudio / bundle

Prepare objects for serialization with a consistent interface
https://rstudio.github.io/bundle/

Bundling XGBoost objects removes variable names when applying xgb.importance() #66

Open joranE opened 1 month ago

joranE commented 1 month ago

I'm aware of the extended discussion in #50, but I was still somewhat surprised that an xgboost model object, after bundling and unbundling, returns garbled variable names when calling xgb.importance(). I understand that the primary purpose of bundling is to preserve the ability to make new predictions, but I wasn't expecting to lose functionality from non-tidymodels functions in the xgboost package itself.

I understand why various tidymodels functionality wouldn't be preserved, but I would have expected that the unbundled object could be passed to functions in the xgboost package designed to work on an xgb.Booster object and return the same results.

Is this the intended behavior of bundling? If so, should I save the variable importance information prior to bundling when I need it?

> library(xgboost)
> library(bundle)
> 
> set.seed(1)
> 
> data(agaricus.train)
> data(agaricus.test)
> 
> xgb <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
+                max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
+                objective = "binary:logistic")
[1] train-logloss:0.233376 
[2] train-logloss:0.136658 
> 
> xgboost::xgb.importance(model = xgb)
                   Feature       Gain     Cover Frequency
                    <char>      <num>     <num>     <num>
1:               odor=none 0.67615470 0.4978746       0.4
2:         stalk-root=club 0.17135376 0.1920543       0.2
3:       stalk-root=rooted 0.12317236 0.1638750       0.2
4: spore-print-color=green 0.02931918 0.1461960       0.2
> 
> xgb_bundle <- bundle(xgb)
> 
> xgboost::xgb.importance(model = unbundle(xgb_bundle))
   Feature       Gain     Cover Frequency
    <char>      <num>     <num>     <num>
1:     f28 0.67615470 0.4978746       0.4
2:     f55 0.17135376 0.1920543       0.2
3:     f59 0.12317236 0.1638750       0.2
4:    f108 0.02931918 0.1461960       0.2
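For reference, one workaround (a sketch, assuming only the importance table is needed later, and using an illustrative file path) is to compute the importance while the booster still carries its feature names and store it alongside the bundle:

```r
library(xgboost)
library(bundle)

data(agaricus.train)

mod <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic")

# Compute importance before bundling, while feature names are intact,
# then keep both pieces together in a single serialized object.
imp <- xgb.importance(model = mod)
saveRDS(list(model = bundle(mod), importance = imp),
        file.path(tempdir(), "xgb_with_importance.rds"))
```

After reloading with readRDS(), the stored importance table retains the readable feature names even though the unbundled booster does not.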
juliasilge commented 1 month ago

Oh, that's interesting! 👀 Notice that you can observe the same problem without using bundle at all, but instead xgboost::xgb.save.raw() and xgboost::xgb.load.raw() (which is how bundle stores xgboost models):

library(xgboost)

set.seed(1)

data(agaricus.train)
data(agaricus.test)

mod <- xgboost(
  data = agaricus.train$data, label = agaricus.train$label,
  max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
  objective = "binary:logistic"
)
#> [1]  train-logloss:0.233376 
#> [2]  train-logloss:0.136658

object <- xgboost::xgb.save.raw(mod, raw_format = "ubj")
res <- xgboost::xgb.load.raw(object, as_booster = TRUE)
xgboost::xgb.importance(model = res)
#>    Feature       Gain     Cover Frequency
#>     <char>      <num>     <num>     <num>
#> 1:     f28 0.67615470 0.4978746       0.4
#> 2:     f55 0.17135376 0.1920543       0.2
#> 3:     f59 0.12317236 0.1638750       0.2
#> 4:    f108 0.02931918 0.1461960       0.2

Created on 2024-07-29 with reprex v2.1.0

I think the source of this problem is probably the same as https://github.com/dmlc/xgboost/issues/5018 (the feature names need to be stored into the booster for raw_format = "ubj"), if you would like to share the problem you are running into there.
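In the meantime, a possible workaround (a sketch, assuming the original training matrix is still available) is to supply the names explicitly through the feature_names argument of xgb.importance(), since the column names of the training matrix are the original feature names:

```r
library(xgboost)

data(agaricus.train)

mod <- xgboost(data = agaricus.train$data, label = agaricus.train$label,
               max_depth = 2, eta = 1, nthread = 2, nrounds = 2,
               objective = "binary:logistic")

# Round-trip through the raw format, which drops the stored names
raw <- xgb.save.raw(mod, raw_format = "ubj")
res <- xgb.load.raw(raw, as_booster = TRUE)

# Pass the names back in explicitly from the training matrix
xgb.importance(feature_names = colnames(agaricus.train$data), model = res)
```

This restores the readable Feature column in the importance table without any change to the booster itself.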

Thanks!