offchan42 opened 2 years ago
@off99555 thank you for reporting the issue. It looks like a problem with CatBoost. Maybe there was an interface change in CatBoost... hard to say. You can try commenting out the CatBoost algorithm in the AutoML() constructor - LightGBM and Xgboost should work.
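A minimal sketch of that workaround, assuming mljar-supervised's AutoML constructor with its algorithms and eval_metric parameters (algorithm names as used in its docs; not the reporter's exact code):

```python
# Sketch: restrict AutoML to LightGBM and Xgboost, leaving CatBoost out.
# Assumes mljar-supervised is installed.
from supervised.automl import AutoML

automl = AutoML(
    algorithms=["LightGBM", "Xgboost"],  # CatBoost omitted from the list
    eval_metric="average_precision",
)
# automl.fit(X, y)  # fit as usual; the Numba warning should no longer appear
```

With CatBoost removed from the list, the custom evaluate wrapper that triggers the Numba warning is never handed to CatBoost.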
I've checked, and CatBoost does indeed seem to be the cause.
@off99555 you mean that there is a bug in CatBoost? Have you created/found a bug issue for them?
I mean that the warning appears when CatBoost is inside the algorithms list. The warning stops when I remove it from the list.
I met this error too:

.local/lib/python3.9/site-packages/catboost/core.py:1723: UserWarning: Failed to optimize method "evaluate" in the passed object: Failed in nopython mode pipeline (step: nopython frontend)
Untyped global name 'negative_average_precision': Cannot determine Numba type of <class 'function'>

File "../.local/lib/python3.9/site-packages/supervised/utils/metric.py", line 288:
    def evaluate(self, approxes, target, weight):
        return -negative_average_precision(target, preds, weight), 0
        ^
When I chose average_precision as the eval_metric, automl.fit() responded with this warning repeatedly. Here's the code (without the dataset definition):

How do I avoid this warning?

sklearn version: 1.0.1
mljar version: 0.11.1
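The original code snippet did not survive, so here is a hypothetical minimal reproduction of the setup described above. The synthetic dataset from make_classification is a stand-in, not the reporter's data:

```python
# Hypothetical reproduction sketch (assumes mljar-supervised is installed).
from sklearn.datasets import make_classification
from supervised.automl import AutoML

# Synthetic binary-classification data standing in for the missing dataset definition.
X, y = make_classification(n_samples=500, n_classes=2, random_state=42)

# Using average_precision as eval_metric; the Numba UserWarning
# reportedly appears while the CatBoost step runs.
automl = AutoML(eval_metric="average_precision")
automl.fit(X, y)
```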