Closed Roh-codeur closed 1 year ago
The closest I could get this to work was as below:

boost = xgboost((X, Y), WeightedLossGradient, WeightedLossHessian;
                num_round=numberOfRounds,
                eta=learningRate,
                tree_method=treeMethod,
                XGBoost.classification(objective="binary:logistic", eval_metric=["auc", "aucpr", "logloss"])...)
That error is telling you that you need to provide num_class=n (I think in your case n == 2) as an argument to xgboost. We probably should make XGBoost.classification handle that more elegantly.
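For reference, the quantities a custom objective pair (such as the WeightedLossGradient / WeightedLossHessian used above) must compute for a binary:logistic-style loss can be sketched in plain Julia. The names grad and hess below are illustrative, not part of the XGBoost.jl API:

```julia
# Sigmoid of the raw margin prediction
σ(x) = 1 / (1 + exp(-x))

# Per-row gradient and hessian of the binary logistic loss with
# respect to the raw margin prediction
grad(margin, label) = σ(margin) - label
hess(margin) = σ(margin) * (1 - σ(margin))

grad(0.0, 1)   # -0.5
hess(0.0)      # 0.25
```

A custom objective then simply broadcasts these over the prediction vector and the label vector for each boosting round.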
Thanks @ExpandingMan: I got it to work as below:

using DataFrames, XGBoost

# Grad and Hess are the custom gradient/hessian functions for the objective
df = DataFrame(a=rand(Int, 10), b=randn(10), y=BitArray(rand(Bool, 10)))
boost = xgboost((df[!, [:a, :b]], df.y), Grad, Hess;
                num_round=100,
                eta=0.3,
                tree_method="hist",
                XGBoost.classification(objective="binary:logistic", eval_metric=["auc", "aucpr", "logloss"])...)
Hi,
I have the below code to predict binary outcomes. I tried running it and got an error, so I then tried:

xgboost((X, Y), WeightedLossGradient, WeightedLossHessian; num_class=2, XGBoost.classification(eval_metric=["auc", "aucpr", "logloss"])...)

and got an error again. How do I set a custom objective in this call, please?

Thanks, Roh