microsoft / FLAML

A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
https://microsoft.github.io/FLAML/

Problem encountered with the example code of AutoML from the official doc #1207

Closed SeaHI-Robot closed 10 months ago

SeaHI-Robot commented 10 months ago

When I run the example code given at https://microsoft.github.io/FLAML/docs/Examples/AutoML-Classification:

from flaml import AutoML
from sklearn.datasets import load_iris

# Initialize an AutoML instance
automl = AutoML()
# Specify automl goal and constraint
automl_settings = {
    "time_budget": 1,  # in seconds
    "metric": 'accuracy',
    "task": 'classification',
    "log_file_name": "iris.log",
}
X_train, y_train = load_iris(return_X_y=True)
# Train with labeled input data
automl.fit(X_train=X_train, y_train=y_train,
           **automl_settings)
# Predict
print(automl.predict_proba(X_train))
# Print the best model
print(automl.model.estimator)

it shows:

TypeError                                 Traceback (most recent call last)
Cell In[15], line 15
     13 X_train, y_train = load_iris(return_X_y=True)
     14 # Train with labeled input data
---> 15 automl.fit(X_train=X_train, y_train=y_train,
     16            **automl_settings)
     17 # Predict
     18 print(automl.predict_proba(X_train))

File e:\miniconda3\lib\site-packages\flaml\automl\automl.py:1925, in AutoML.fit(self, X_train, y_train, dataframe, label, metric, task, n_jobs, log_file_name, estimator_list, time_budget, max_iter, sample, ensemble, eval_method, log_type, model_history, split_ratio, n_splits, log_training_metric, mem_thres, pred_time_limit, train_time_limit, X_val, y_val, sample_weight_val, groups_val, groups, verbose, retrain_full, split_type, learner_selector, hpo_method, starting_points, seed, n_concurrent_trials, keep_search_state, preserve_checkpoint, early_stop, force_cancel, append_log, auto_augment, min_sample_size, use_ray, use_spark, free_mem_ratio, metric_constraints, custom_hp, time_col, cv_score_agg_func, skip_transform, mlflow_logging, fit_kwargs_by_estimator, **fit_kwargs)
   1923 with training_log_writer(log_file_name, append_log) as save_helper:
   1924     self._training_log = save_helper
-> 1925     self._search()
   1926 else:
   1927     self._training_log = None

File e:\miniconda3\lib\site-packages\flaml\automl\automl.py:2482, in AutoML._search(self)
   2480     state.best_config = state.init_config[0] if state.init_config else {}
   2481 elif self._use_ray is False and self._use_spark is False:
-> 2482     self._search_sequential()
   2483 else:
   2484     self._search_parallel()

File e:\miniconda3\lib\site-packages\flaml\automl\automl.py:2318, in AutoML._search_sequential(self)
...
    213 if logger.level == logging.DEBUG:
    214     # xgboost 1.6 doesn't display all the params in the model str
    215     logger.debug(f"flaml.model - {model} fit started with params {self.params}")

TypeError: 'NoneType' object is not callable

sonichi commented 10 months ago

It works for me. Did you run pip install "flaml[automl]" first?
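
In case a quick check helps, here is a minimal sketch for verifying that the optional AutoML dependencies are importable in the active environment. The package list below is an assumption about what the automl extra pulls in; FLAML's setup.py for your installed version is the authoritative source.

import importlib.util

# Assumed set of packages installed by: pip install "flaml[automl]"
# (adjust to match your FLAML version)
optional_deps = ["lightgbm", "xgboost", "sklearn", "pandas", "scipy"]

missing = [name for name in optional_deps if importlib.util.find_spec(name) is None]
if missing:
    print("Missing optional dependencies:", ", ".join(missing))
    print('Reinstall with: pip install "flaml[automl]"')
else:
    print("All optional AutoML dependencies appear to be importable.")

If any of these are missing, a plausible cause of the TypeError above is that an estimator's underlying library failed to import, so the search cannot instantiate it.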

SeaHI-Robot commented 10 months ago

Thanks a lot, I guess there were some problems in my Python environment. It works well now!