Is it specific to lrl1? There is no use of '_' in the code related to lrl1.
It is; all the other ML learners, including customized ones, are fine.
I got a similar error when running the notebook example in RStudio via reticulate:
from flaml import AutoML
from flaml.data import load_openml_dataset

X_train, X_test, y_train, y_test = load_openml_dataset(dataset_id=1169, data_dir='./')

automl = AutoML()
settings = {
    "time_budget": 60,  # total running time in seconds
    "metric": 'roc_auc',
    "task": 'classification',  # task type
    "estimator_list": ['lrl1', 'lgbm'],
    "log_file_name": 'airlines_experiment.log',  # flaml log file
}
automl.fit(X_train=X_train, y_train=y_train, **settings)
[flaml.automl: 04-05 09:20:34] {884} INFO - Evaluation method: holdout
[flaml.automl: 04-05 09:20:35] {591} INFO - Using StratifiedKFold
[flaml.automl: 04-05 09:20:35] {905} INFO - Minimizing error metric: 1-roc_auc
[flaml.automl: 04-05 09:20:35] {925} INFO - List of ML learners in AutoML Run: ['lrl1', 'lgbm']
[flaml.automl: 04-05 09:20:35] {986} INFO - iteration 0 current learner lrl1
TypeError: object of type 'AutoML' has no len()
It only happens with lrl1; all the other default learners and my customized learners are fine.
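In case anyone needs a stopgap while this is investigated, here is a minimal workaround sketch, assuming the failure really is confined to the lrl1 learner as described above (the settings_no_lrl1 and automl2 names are just illustrative): drop 'lrl1' from estimator_list and rerun the search.

# Workaround sketch only, not a fix: rerun without the failing 'lrl1' learner.
# Reuses X_train, y_train, and the settings dict from the snippet above.
settings_no_lrl1 = dict(settings)
settings_no_lrl1["estimator_list"] = ['lgbm']
automl2 = AutoML()
automl2.fit(X_train=X_train, y_train=y_train, **settings_no_lrl1)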
The versions of packages installed in that environment are:
reticulate:::pip_freeze(python='/data/home/r-reticulate/bin/python')

alembic==1.5.4
argon2-cffi==20.1.0
async-generator==1.10
attrs==20.3.0
backcall==0.2.0
bleach==3.3.0
catboost==0.24.4
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
cliff==3.7.0
cmaes==0.8.1
cmd2==1.5.0
colorama==0.4.4
colorlog==4.7.2
cycler==0.10.0
decorator==5.0.5
defusedxml==0.7.1
entrypoints==0.3
FLAML==0.2.10
graphviz==0.16
idna==2.10
importlib-metadata==3.4.0
ipykernel==5.5.3
ipython==7.16.1
ipython-genutils==0.2.0
ipywidgets==7.6.3
jedi==0.18.0
Jinja2==2.11.3
joblib==1.0.1
jsonschema==3.2.0
jupyter==1.0.0
jupyter-client==6.1.12
jupyter-console==6.4.0
jupyter-core==4.7.1
jupyterlab-pygments==0.1.2
jupyterlab-widgets==1.0.0
kiwisolver==1.3.1
liac-arff==2.5.0
lightgbm==3.1.1
Mako==1.1.4
MarkupSafe==1.1.1
matplotlib==3.2.0
mistune==0.8.4
nbclient==0.5.3
nbconvert==6.0.7
nbformat==5.1.3
nest-asyncio==1.5.1
notebook==6.3.0
numpy==1.19.5
openml==0.10.2
optuna==2.3.0
packaging==20.9
pandas==1.1.5
pandocfilters==1.4.3
parso==0.8.2
pbr==5.5.1
pexpect==4.8.0
pickleshare==0.7.5
Pillow==8.1.0
plotly==4.14.3
prettytable==2.0.0
prometheus-client==0.10.0
prompt-toolkit==3.0.18
ptyprocess==0.7.0
pycparser==2.20
Pygments==2.8.1
pyparsing==2.4.7
pyperclip==1.8.1
pyrsistent==0.17.3
python-dateutil==2.8.1
python-editor==1.0.4
pytz==2021.1
PyYAML==5.4.1
pyzmq==22.0.3
qtconsole==5.0.3
QtPy==1.9.0
requests==2.25.1
retrying==1.3.3
rgf-python==3.9.0
scikit-learn==0.24.1
scipy==1.5.4
Send2Trash==1.5.0
six==1.15.0
SQLAlchemy==1.3.23
stevedore==3.3.0
terminado==0.9.4
testpath==0.4.4
threadpoolctl==2.1.0
tornado==6.1
tqdm==4.56.2
traitlets==4.3.3
typing-extensions==3.7.4.3
urllib3==1.26.4
wcwidth==0.2.5
webencodings==0.5.1
widgetsnbextension==3.5.1
xgboost==1.3.3
xmltodict==0.12.0
zipp==3.4.0
Any ideas?
Thank you!
@flippercy Does this issue still exist in the latest version?
@sonichi: I just checked and the issue does not exist anymore. How did you fix it? Just curious.
Thank you.
I haven't done anything specifically for this issue; I just thought it was worth trying again given the many updates we've made since then. It's good to see the problem is gone now.
Hi:
I've received the error message below with lrl1 when using FLAML in RStudio via reticulate:
[flaml.automl: 04-02 14:24:16] {986} INFO - iteration 0 current learner lrl1
NameError: name '_' is not defined
Interestingly, the same code ran fine in Jupyter, and the versions of scikit-learn in the two environments are the same.
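For anyone trying to reproduce the difference, here is a minimal environment check to run in both Jupyter and the reticulate session (illustrative only; it just prints the interpreter path and package versions, and falls back to "unknown" if flaml doesn't expose __version__):

# Environment sanity check (illustrative): confirm both front ends use the same
# interpreter and the same FLAML / scikit-learn versions.
import sys
import sklearn
import flaml

print("interpreter:", sys.executable)
print("scikit-learn:", sklearn.__version__)
print("FLAML:", getattr(flaml, "__version__", "unknown"))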
Any ideas?
Thank you.