tdhock closed this issue 7 months ago.

Hi! First of all, thanks for maintaining mlr3, which I find very useful. I noticed that the coefficient weights from mlr3learners::LearnerClassifCVGlmnet are the negatives of the weights from plain cv.glmnet. I expected them to be the same, or at least for this difference to be documented, so is this a bug? Here is some code that can be run to see the difference; the result I got on my system is shown below it. The important part of the output is the result of the coef() method, which shows that the signs are inverted.
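The original script and its output are not preserved in this extract; the following is a minimal sketch of the comparison, assuming the built-in sonar task and the classif.cv_glmnet learner key. Up to differences from random CV fold assignment, the two coef() calls should show coefficient vectors of opposite sign.

```r
# Minimal sketch of the comparison (the original reproduction code is
# not shown in this extract). Assumes mlr3, mlr3learners, and glmnet
# are installed; uses the built-in binary "sonar" task.
library(mlr3)
library(mlr3learners)
library(glmnet)

task <- tsk("sonar")
X <- as.matrix(task$data(cols = task$feature_names))
y <- task$truth()

set.seed(1)
fit_direct <- cv.glmnet(X, y, family = "binomial")  # plain cv.glmnet

set.seed(1)
learner <- lrn("classif.cv_glmnet")
learner$train(task)  # mlr3 wrapper around cv.glmnet

# Compare coefficients at the same lambda: the signs from the mlr3
# learner are flipped relative to plain cv.glmnet.
coef(fit_direct, s = "lambda.min")
coef(learner$model, s = "lambda.min")
```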
This may be fixed by removing swap_levels? https://github.com/mlr-org/mlr3learners/blob/main/R/LearnerClassifCVGlmnet.R#L105
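For context, here is a hypothetical stand-in for the swap_levels call at the linked line (the actual mlr3learners implementation may differ): reversing the order of the target's two factor levels changes which class glmnet's binomial model treats as the positive class, which is exactly what negates every coefficient.

```r
# Hypothetical illustration of a level swap (not the actual
# mlr3learners source): reverse the order of a factor's levels.
swap_levels <- function(y) {
  factor(y, levels = rev(levels(y)))
}

y <- factor(c("M", "R", "M", "R"))
levels(y)               # "M" "R"
levels(swap_levels(y))  # "R" "M"
```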
Hey @tdhock, thanks for bringing this up. This seems to be intentional to keep it consistent with the behavior of glmnet:
https://github.com/mlr-org/mlr3learners/commit/fbb2275aa0b0f56eaba183a28b307d92999e6df1
It also seems to be documented under the "Internal Encoding" section.
> This seems to be intentional to keep it consistent with the behavior of glmnet:
You mean glm, right? I guess that is reasonable, but as a user of glmnet, I would expect mlr3 output to match glmnet output (not glm, which I don't use very much, and I was not aware of the alternative convention).
Yes, right: glm.
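For reference, the glm convention referred to above can be demonstrated directly: with a two-level factor response, stats::glm(family = binomial) models the probability of the second factor level, so reversing the level order negates every coefficient. A minimal sketch with made-up data:

```r
# With a factor response, glm(family = binomial) models the probability
# of the SECOND factor level, so reversing the level order negates the
# coefficients. The data here are made up for illustration.
set.seed(1)
x <- rnorm(100)
y <- factor(ifelse(x + rnorm(100) > 0, "pos", "neg"))  # levels: neg, pos

coef(glm(y ~ x, family = binomial))         # models P(y == "pos")
y_rev <- factor(y, levels = rev(levels(y))) # levels: pos, neg
coef(glm(y_rev ~ x, family = binomial))     # models P(y == "neg")
# The two coefficient vectors are equal up to sign.
```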