mljar / mljar-supervised

Python package for AutoML on Tabular Data with Feature Engineering, Hyper-Parameters Tuning, Explanations and Automatic Documentation
https://mljar.com
MIT License
2.97k stars, 392 forks

'module' object is not callable #721

Open · tuomassiren opened this issue 2 months ago

tuomassiren commented 2 months ago

Any ideas what might be causing the "'module' object is not callable" error?

mljar-scikit-plot 0.3.10
mljar-supervised 1.1.7


```
Linear algorithm was disabled.
AutoML directory:  xxx
The task is binary_classification with evaluation metric logloss
AutoML will use algorithms: ['Random Forest', 'LightGBM', 'Xgboost', 'CatBoost', 'Neural Network']
AutoML will ensemble available models
AutoML steps: ['simple_algorithms', 'default_algorithms', 'not_so_random', 'golden_features', 'insert_random_feature', 'features_selection', 'hill_climbing_1', 'hill_climbing_2', 'ensemble']
Skip simple_algorithms because no parameters were generated.
* Step default_algorithms will try to check up to 5 models
'module' object is not callable
1_Default_LightGBM logloss 0.614364 trained in 49.53 seconds (1-sample predict time 0.0425 seconds)
'module' object is not callable
2_Default_Xgboost logloss 0.61475 trained in 26.49 seconds (1-sample predict time 0.0461 seconds)
'module' object is not callable
3_Default_CatBoost logloss 0.613929 trained in 19.38 seconds (1-sample predict time 0.0501 seconds)
'module' object is not callable

...
```

The code doesn't do anything out of the ordinary (`X`, `y`, and `model_path` are defined earlier):

```python
from sklearn.model_selection import train_test_split
from supervised.automl import AutoML

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=7)
model = AutoML(mode="Perform", results_path=model_path, total_time_limit=14400)
model = model.fit(X_train, y_train)
```
pplonski commented 2 months ago

Hi @tuomassiren,

Might be a bug; we need to reproduce the issue and investigate. @Bocianski

tuomassiren commented 2 months ago

Downgraded to 1.1.6; the error doesn't occur there.
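
For anyone else hitting this: until the regression is fixed, pinning the last known-good release sidesteps the error. A minimal sketch of that check, assuming the versions reported in this thread (the helper name and structure are mine, not part of mljar-supervised):

```python
# Versions as reported in this issue: 1.1.7 raises the error, 1.1.6 does not.
AFFECTED = "1.1.7"
KNOWN_GOOD = "1.1.6"

def downgrade_hint(installed: str, package: str = "mljar-supervised") -> str:
    """Return a pip pin command if the affected version is installed, else ''."""
    if installed == AFFECTED:
        return f'pip install "{package}=={KNOWN_GOOD}"'
    return ""
```

The installed version can be read with `importlib.metadata.version("mljar-supervised")` and fed to the helper; if it prints a command, running it downgrades to the working release.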

pplonski commented 2 months ago

Thanks @tuomassiren for the info. It might be a problem with newer dependency releases and API changes in them. We will investigate this. Thank you!