microsoft / FLAML

A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
https://microsoft.github.io/FLAML/
MIT License

Kernel panic / Segmentation fault when trying to run example #223

Closed. angela97lin closed this issue 3 years ago

angela97lin commented 3 years ago

Hi! I tried to install and run a simple example, but ran into a Segmentation Fault: 11 error when using Python 3.9.6:


from sklearn import datasets

def X_y():
    X, y = datasets.make_classification(n_samples=100, n_features=20,
                                        n_informative=2, n_redundant=2, random_state=0)

    return X, y

from flaml import AutoML
automl = AutoML()
automl.fit(X, y, task="classification")

Running this in both my Jupyter notebook and via bash gives me:

$ python3 innodays_flaml.py
[flaml.automl: 09-24 10:45:44] {1431} INFO - Evaluation method: cv
[flaml.automl: 09-24 10:45:44] {1477} INFO - Minimizing error metric: 1-accuracy
[flaml.automl: 09-24 10:45:44] {1514} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'catboost', 'xgboost', 'extra_tree', 'lrl1']
[flaml.automl: 09-24 10:45:44] {1746} INFO - iteration 0, current learner lgbm
Segmentation fault: 11
angela97lin commented 3 years ago

I also tried this with Python 3.7.9, and got the same error. Here are my packages:

appnope==0.1.2
argcomplete==1.12.3
argon2-cffi==21.1.0
attrs==21.2.0
backcall==0.2.0
bleach==4.1.0
catboost==0.26.1
certifi==2021.5.30
cffi==1.14.6
charset-normalizer==2.0.6
cycler==0.10.0
debugpy==1.4.3
decorator==5.1.0
defusedxml==0.7.1
entrypoints==0.3
FLAML==0.6.4
graphviz==0.17
idna==3.2
importlib-metadata==4.8.1
ipykernel==6.4.1
ipython==7.27.0
ipython-genutils==0.2.0
ipywidgets==7.6.5
jedi==0.18.0
Jinja2==3.0.1
joblib==1.0.1
jsonschema==3.2.0
jupyter==1.0.0
jupyter-client==7.0.3
jupyter-console==6.4.0
jupyter-core==4.8.1
jupyter-tabnine==1.2.3
jupyterlab-pygments==0.1.2
jupyterlab-widgets==1.0.2
kiwisolver==1.3.2
liac-arff==2.5.0
lightgbm==3.2.1
MarkupSafe==2.0.1
matplotlib==3.2.0
matplotlib-inline==0.1.3
mistune==0.8.4
nbclient==0.5.4
nbconvert==6.1.0
nbformat==5.1.3
nest-asyncio==1.5.1
notebook==6.4.4
numpy==1.21.2
openml==0.10.2
packaging==21.0
pandas==1.3.3
pandocfilters==1.5.0
parso==0.8.2
pexpect==4.8.0
pickleshare==0.7.5
Pillow==8.3.2
plotly==5.3.1
prometheus-client==0.11.0
prompt-toolkit==3.0.20
ptyprocess==0.7.0
pycparser==2.20
Pygments==2.10.0
pyparsing==2.4.7
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.1
pyzmq==22.3.0
qtconsole==5.1.1
QtPy==1.11.2
requests==2.26.0
rgf-python==3.11.0
scikit-learn==1.0
scipy==1.7.1
Send2Trash==1.8.0
six==1.16.0
tenacity==8.0.1
terminado==0.12.1
testpath==0.5.0
threadpoolctl==2.2.0
tornado==6.1
traitlets==5.1.0
typing-extensions==3.10.0.2
urllib3==1.26.7
wcwidth==0.2.5
webencodings==0.5.1
widgetsnbextension==3.5.1
xgboost==1.3.3
xmltodict==0.12.0
zipp==3.5.0

These packages were installed in a fresh virtualenv, where the only commands run were pip install flaml and pip install flaml[notebook].

sonichi commented 3 years ago

In your code snippet, X and y look undefined. I modified your code as follows and it works for me in Python 3.7.9.

from sklearn import datasets

X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=2, n_redundant=2, random_state=0)

from flaml import AutoML
automl = AutoML()
automl.fit(X, y, task="classification")
angela97lin commented 3 years ago

@sonichi My bad, that was a copy-and-paste error. Even with X and y properly defined, the code segfaults!

Did you try with a fresh virtual environment? Oddly enough, in a separate virtual environment with many other packages installed, the code runs fine, so I wonder if there's a package missing?

sonichi commented 3 years ago

Yes, it works for me in a fresh conda environment with Python 3.7.9 and pip install flaml. The packages installed are:

Package              Version
astroid              2.3.3
backcall             0.2.0
catboost             0.26.1
certifi              2021.5.30
colorama             0.4.3
cycler               0.10.0
decorator            4.4.2
FLAML                0.6.4
graphviz             0.17
ipykernel            5.3.4
ipython              7.19.0
ipython-genutils     0.2.0
isort                4.3.21
jedi                 0.17.2
joblib               1.0.1
jupyter-client       6.1.7
jupyter-core         4.7.0
kiwisolver           1.3.2
lazy-object-proxy    1.4.3
lightgbm             3.2.1
matplotlib           3.2.0
mccabe               0.6.1
numpy                1.21.2
pandas               1.3.3
parso                0.7.1
pickleshare          0.7.5
pip                  21.2.2
plotly               5.3.1
prompt-toolkit       3.0.8
Pygments             2.7.2
pylint               2.4.4
pyparsing            2.4.7
python-dateutil      2.8.2
pytz                 2021.1
pyzmq                20.0.0
rgf-python           3.9.0
rope                 0.16.0
scikit-learn         1.0
scipy                1.7.1
setuptools           58.0.4
six                  1.14.0
tenacity             8.0.1
threadpoolctl        2.2.0
tornado              6.1
traitlets            5.0.5
typed-ast            1.4.1
wheel                0.37.0
wincertstore         0.2
wrapt                1.11.2
xgboost              1.3.3

angela97lin commented 3 years ago

Not sure if this makes a difference, but I'm not using conda! 🤔

sonichi commented 3 years ago

I tried venv with Python 3.7.5 and it works too. I ran pip install -U pip before pip install flaml because the default pip version in the venv is too old. Packages:

Package              Version
catboost             0.26.1
cycler               0.10.0
FLAML                0.6.4
graphviz             0.17
joblib               1.0.1
kiwisolver           1.3.2
lightgbm             3.2.1
matplotlib           3.4.3
numpy                1.21.2
pandas               1.3.3
Pillow               8.3.2
pip                  21.2.4
pkg_resources        0.0.0
plotly               5.3.1
pyparsing            2.4.7
python-dateutil      2.8.2
pytz                 2021.1
scikit-learn         1.0
scipy                1.7.1
setuptools           39.0.1
six                  1.16.0
tenacity             8.0.1
threadpoolctl        2.2.0
wheel                0.37.0
xgboost              1.3.3

angela97lin commented 3 years ago

I tried again with a new environment, but still no luck. Here are the exact commands I ran:

(flaml) $ python3 --version
Python 3.7.9

pip install -U pip
pip install flaml

Dependencies look like:

catboost==0.26.1
cycler==0.10.0
FLAML==0.6.4
graphviz==0.17
joblib==1.0.1
kiwisolver==1.3.2
lightgbm==3.2.1
matplotlib==3.4.3
numpy==1.21.2
pandas==1.3.3
Pillow==8.3.2
plotly==5.3.1
pyparsing==2.4.7
python-dateutil==2.8.2
pytz==2021.1
scikit-learn==1.0
scipy==1.7.1
six==1.16.0
tenacity==8.0.1
threadpoolctl==2.2.0
xgboost==1.3.3

Currently using pip 21.2.4.

Could this be related to OS? I'm using macOS Big Sur, version 11.5.2. 🤔

qingyun-wu commented 3 years ago

Hi @angela97lin, I wonder whether your Mac has an Intel chip or an Apple M1? Thank you!

yue-msr commented 3 years ago

Hi, I tried Python 3.7.9 and 3.9.6 on Windows. They both worked fine.

angela97lin commented 3 years ago

I have an Intel chip: 2.3 GHz 8-Core Intel Core i9

ekzhu commented 3 years ago

Hi, I tried Python 3.8.6 on a dual-core Intel Core i7 running macOS Big Sur. It works fine for me.

sonichi commented 3 years ago

@angela97lin Could you try pip install --no-cache-dir flaml? Maybe some package is corrupted in the cache.

qingyun-wu commented 3 years ago

Hi @angela97lin, I tried Python 3.8.5 and Python 3.7.5 with MacOS Big Sur and Intel chip. Both work fine for me. Can you try re-installing with the --no-cache-dir option as suggested by sonichi?

angela97lin commented 3 years ago

Thanks for all of the replies! I've tried with pip install --no-cache-dir flaml, but still no luck.

I noticed that it usually segfaults during lightgbm / xgboost. When I update my code to:


from sklearn import datasets

X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=2, n_redundant=2, random_state=0)

from flaml import AutoML
automl = AutoML()
automl.fit(X, y, task="classification", estimator_list=['rf', 'catboost', 'extra_tree', 'lrl1'])

Everything runs to completion!
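
As a rough way to pin down the offending learner, here is a sketch (assuming the same X, y as above) that runs each estimator in isolation. Since a segfault kills the process, the last learner printed is the one that crashed:

from flaml import AutoML

for learner in ['lgbm', 'xgboost', 'rf', 'catboost', 'extra_tree', 'lrl1']:
    print(f"trying {learner}", flush=True)
    automl = AutoML()
    # short budget per learner; estimator_list restricts the run to a single estimator
    automl.fit(X, y, task="classification", estimator_list=[learner], time_budget=10)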

sonichi commented 3 years ago

Do lightgbm and xgboost work on your machine without flaml?
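
For example, a minimal check along these lines (a sketch assuming the X, y from the earlier make_classification snippet):

from lightgbm import LGBMClassifier
from xgboost import XGBClassifier

# fit each library directly, outside flaml, on the same small dataset
LGBMClassifier(n_estimators=4).fit(X, y)
print("lightgbm fit ok")
XGBClassifier(n_estimators=4).fit(X, y)
print("xgboost fit ok")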

angela97lin commented 3 years ago

Yup, they do.

Not sure if this is helpful, but everything works fine in a virtual environment that I use day-to-day for development. This virtual environment has many more packages installed, listed below. I only run into this issue when I try to create a new virtual environment.

Packages in virtual environment that does not cause FLAML to segfault:

alabaster==0.7.12
amqp==5.0.6
ansible==4.2.0
ansible-core==2.11.2
apipkg==1.5
appdirs==1.4.4
appnope==0.1.2
argon2-cffi==20.1.0
astroid==2.6.5
async-generator==1.10
atomicwrites==1.4.0
attrs==21.2.0
autopep8==1.4.3
Babel==2.9.1
backcall==0.2.0
beautifulsoup4==4.9.3
billiard==3.6.4.0
black==21.5b1
bleach==3.3.0
boto3==1.17.74
botocore==1.20.74
cached-property==1.5.2
catboost==0.26
category-encoders==2.2.2
celery==5.1.2
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
click==7.1.2
click-didyoumean==0.0.3
click-plugins==1.1.1
click-repl==0.2.0
cloudpickle==1.6.0
codecov==2.1.11
colorama==0.4.4
coverage==5.5
cryptography==3.4.7
cycler==0.10.0
Cython==0.29.17
darglint==1.8.0
dask==2021.8.1
decorator==4.4.2
defusedxml==0.7.1
distributed==2021.8.1
docformatter==1.4
docker==5.0.0
docutils==0.16
ecdsa==0.17.0
entrypoints==0.3
execnet==1.8.0
Faker==8.11.0
fastparquet==0.3.1
featuretools==0.26.1
flake8==3.7.1
FLAML==0.6.4
Flask==2.0.1
fsspec==2021.5.0
future==0.18.2
graphviz==0.17
h2o==3.34.0.1
HeapDict==1.0.1
idna==2.10
imagesize==1.2.0
imbalanced-learn==0.8.0
importlib-metadata==4.0.1
iniconfig==1.1.1
ipykernel==5.5.5
ipython==7.23.1
ipython-genutils==0.2.0
ipywidgets==7.6.3
isort==5.0.0
itsdangerous==2.0.1
jedi==0.18.0
Jinja2==2.11.3
jmespath==0.10.0
joblib==1.0.1
jsonschema==3.2.0
jupyter==1.0.0
jupyter-client==6.1.12
jupyter-console==6.4.0
jupyter-core==4.7.1
jupyterlab-pygments==0.1.2
jupyterlab-widgets==1.0.0
kaleido==0.2.1
kiwisolver==1.3.1
kombu==5.1.0
lazy-object-proxy==1.6.0
liac-arff==2.5.0
lightgbm==3.0.0
line-profiler==3.2.6
llvmlite==0.36.0
locket==0.2.1
MarkupSafe==1.1.1
matplotlib==3.4.3
matplotlib-inline==0.1.2
mccabe==0.6.1
minio==7.0.3
mistune==0.8.4
more-itertools==8.7.0
moto==2.0.10
msgpack==1.0.2
mypy-extensions==0.4.3
nbclient==0.5.3
nbconvert==6.0.7
nbformat==5.1.3
nbsphinx==0.8.5
nbval==0.9.3
nest-asyncio==1.5.1
networkx==2.5.1
nlp-primitives==1.1.0
nltk==3.6.2
notebook==6.4.0
numba==0.53.0
numexpr==2.7.3
numpy==1.20.3
openml==0.12.2
packaging==20.9
pandas==1.3.3
pandocfilters==1.4.3
parso==0.8.2
partd==1.2.0
pathspec==0.8.1
patsy==0.5.1
pexpect==4.8.0
pickleshare==0.7.5
Pillow==8.2.0
plotly==5.1.0
pluggy==0.13.1
pmdarima==1.8.0
prometheus-client==0.10.1
prompt-toolkit==3.0.18
psutil==5.8.0
psycopg2-binary==2.9.1
ptyprocess==0.7.0
py==1.10.0
pyaml==20.4.0
pyarrow==4.0.0
pycodestyle==2.5.0
pycparser==2.20
pydata-sphinx-theme==0.6.3
pydocstyle==6.1.1
pyflakes==2.1.1
Pygments==2.9.0
pyparsing==2.4.7
pyrsistent==0.17.3
pytest==6.0.1
pytest-cov==2.10.1
pytest-forked==1.3.0
pytest-xdist==2.1.0
python-dateutil==2.8.1
pytz==2021.1
PyYAML==5.4
pyzmq==21.0.2
qtconsole==5.1.1
QtPy==1.11.2
regex==2021.4.4
requests==2.25.1
requirements-parser==0.2.0
resolvelib==0.5.4
responses==0.13.3
retrying==1.3.3
rgf-python==3.11.0
ruamel.yaml==0.17.4
ruamel.yaml.clib==0.2.2
s3fs==0.3.3
s3transfer==0.4.2
scikit-learn==0.24.2
scikit-optimize==0.8.1
scipy==1.6.3
seaborn==0.11.1
Send2Trash==1.5.0
shap==0.39.0
six==1.16.0
sktime==0.8.0
slack-sdk==3.8.0
slicer==0.0.7
snakeviz==2.1.0
snowballstemmer==2.1.0
sortedcontainers==2.4.0
soupsieve==2.2.1
Sphinx==3.5.4
sphinx-autoapi==1.8.2
sphinxcontrib-applehelp==1.0.2
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==1.0.3
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.4
sshpubkeys==3.3.1
statsmodels==0.12.2
tables==3.6.1
tabulate==0.8.9
tblib==1.7.0
tenacity==7.0.0
terminado==0.10.0
testpath==0.5.0
text-unidecode==1.3
texttable==1.6.4
threadpoolctl==2.1.0
thrift==0.13.0
toml==0.10.2
toolz==0.11.1
tornado==6.1
tqdm==4.60.0
traitlets==5.0.5
typed-ast==1.4.3
typing-extensions==3.10.0.0
Unidecode==1.2.0
untokenize==0.1.1
urllib3==1.26.4
vine==5.0.0
vowpalwabbit==8.11.0
wcwidth==0.2.5
webencodings==0.5.1
websocket-client==1.1.0
Werkzeug==2.0.1
widgetsnbextension==3.5.1
woodwork==0.8.0
wrapt==1.12.1
xgboost==1.3.3
xmltodict==0.12.0
zict==2.0.0
zipp==3.4.1
sonichi commented 3 years ago

The lightgbm version in this working env is 3.0.0, while the failing env has 3.2.1. Could that be the cause?
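
A quick way to confirm which lightgbm build the failing env actually imports (a small sketch):

import lightgbm
# print the version and the path of the imported module
print(lightgbm.__version__, lightgbm.__file__)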

angela97lin commented 3 years ago

Still no luck 😬

(flaml) angela.lin ~/Desktop -  $ pip install lightgbm==3.0.0
Requirement already satisfied: lightgbm==3.0.0 in ./flaml/lib/python3.7/site-packages (3.0.0)
Requirement already satisfied: scipy in ./flaml/lib/python3.7/site-packages (from lightgbm==3.0.0) (1.7.1)
Requirement already satisfied: numpy in ./flaml/lib/python3.7/site-packages (from lightgbm==3.0.0) (1.21.2)
Requirement already satisfied: scikit-learn!=0.22.0 in ./flaml/lib/python3.7/site-packages (from lightgbm==3.0.0) (1.0)
Requirement already satisfied: joblib>=0.11 in ./flaml/lib/python3.7/site-packages (from scikit-learn!=0.22.0->lightgbm==3.0.0) (1.0.1)
Requirement already satisfied: threadpoolctl>=2.0.0 in ./flaml/lib/python3.7/site-packages (from scikit-learn!=0.22.0->lightgbm==3.0.0) (2.2.0)
(flaml) angela.lin ~/Desktop -  $ python3 innodays_flaml.py
[flaml.automl: 09-28 15:46:37] {1432} INFO - Evaluation method: cv
[flaml.automl: 09-28 15:46:37] {1478} INFO - Minimizing error metric: 1-roc_auc
[flaml.automl: 09-28 15:46:37] {1515} INFO - List of ML learners in AutoML Run: ['lgbm', 'rf', 'catboost', 'xgboost', 'extra_tree', 'lrl1']
[flaml.automl: 09-28 15:46:37] {1747} INFO - iteration 0, current learner lgbm
Fatal Python error: Fatal Python error: Segmentation fault: 11
(flaml) angela.lin~/Desktop -  $
sonichi commented 3 years ago

We can add some debugging info to the output and then you can run it and confirm whether the error happens right before lightgbm training. Would you like to do that?

angela97lin commented 3 years ago

Sure! If it's helpful, installing FLAML in editable mode and setting a breakpoint shows:

> /Users/angela.lin/Desktop/FLAML/flaml/tune/tune.py(386)run()
-> result = training_function(trial_to_run.config)
(Pdb) training_function
functools.partial(<function AutoMLState._compute_with_config_base at 0x1339ff050>, <flaml.automl.AutoMLState object at 0x106923590>, 'lgbm')
(Pdb) (trial_to_run.config)
{'n_estimators': 4, 'num_leaves': 4, 'min_child_samples': 20, 'learning_rate': 0.09999999999999995, 'log_max_bin': 8, 'colsample_bytree': 1.0, 'reg_alpha': 0.0009765625, 'reg_lambda': 1.0}

Stepping through, it segfaults when running compute_estimator inside _compute_with_config_base.
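
One more thing that may help pinpoint where the crash happens: enabling the standard-library faulthandler before fit, so the interpreter dumps a Python traceback when the segfault occurs. A minimal sketch of the same repro:

import faulthandler
faulthandler.enable()  # dump a Python traceback on SIGSEGV

from sklearn import datasets
from flaml import AutoML

X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=2, n_redundant=2, random_state=0)
automl = AutoML()
automl.fit(X, y, task="classification")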

sonichi commented 3 years ago

I added a debugging mode in the msg branch. If you install that branch of FLAML with pip install -e, you can pass verbose=4 to AutoML.fit() and see the debugging info. If you see

[flaml.automl: 10-01 08:59:50] {98} DEBUG - flaml.model - LGBMClassifier(learning_rate=0.26770501231052046, max_bin=127,
               min_child_samples=12, n_estimators=4, n_jobs=1, num_leaves=4,
               reg_alpha=0.001348364934537134, reg_lambda=1.4442580148221913,
               verbose=-1) fit started
[flaml.automl: 10-01 08:59:50] {101} DEBUG - flaml.model - LGBMClassifier(learning_rate=0.26770501231052046, max_bin=127,
               min_child_samples=12, n_estimators=4, n_jobs=1, num_leaves=4,
               reg_alpha=0.001348364934537134, reg_lambda=1.4442580148221913,
               verbose=-1) fit finished

before the segfault, then the lgbm fit succeeded. If you only see the first message but not the second, then the segfault happens during fit.
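
For reference, the run with the extra logging would look something like this (a sketch assuming the msg branch is installed in editable mode):

from sklearn import datasets
from flaml import AutoML

X, y = datasets.make_classification(n_samples=100, n_features=20,
                                    n_informative=2, n_redundant=2, random_state=0)
automl = AutoML()
automl.fit(X, y, task="classification", verbose=4)  # verbose=4 enables the DEBUG messages above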

sonichi commented 3 years ago

@angela97lin In #243 another user reported a similar issue on macOS and confirmed that lgbm's fit() fails without flaml. Could you test lightgbm in your env without flaml too?

sonichi commented 3 years ago

A conda package of flaml is available now and it should resolve this issue: https://github.com/microsoft/FLAML/issues/194#issuecomment-942131636

angela97lin commented 3 years ago

@sonichi Sorry for the late response! I confirmed that lightgbm segfaults:

import lightgbm as lgbm
lgbm_classifier = lgbm.sklearn.LGBMClassifier()
lgbm_classifier.fit(X, y)

>>> Segmentation fault: 11

I was able to successfully run my example using conda! Just a side note though, I wasn't able to use catboost without installing it separately. Is it not listed as a dependency?

conda list gives me:

# Name                    Version                   Build  Channel
_py-xgboost-mutex         2.0                       cpu_0    conda-forge
brotlipy                  0.7.0           py39h89e85a6_1001    conda-forge
ca-certificates           2021.10.8            h033912b_0    conda-forge
certifi                   2021.10.8        py39h6e9494a_0    conda-forge
cffi                      1.14.6           py39he338e87_1    conda-forge
chardet                   4.0.0            py39h6e9494a_1    conda-forge
charset-normalizer        2.0.0              pyhd8ed1ab_0    conda-forge
colorama                  0.4.4              pyh9f0ad1d_0    conda-forge
conda                     4.10.3           py39h6e9494a_2    conda-forge
conda-package-handling    1.7.3            py39h89e85a6_0    conda-forge
cryptography              3.4.8            py39ha2c9959_0    conda-forge
flaml                     0.6.7              pyhd8ed1ab_1    conda-forge
idna                      3.1                pyhd3deb0d_0    conda-forge
joblib                    1.1.0              pyhd8ed1ab_0    conda-forge
libblas                   3.9.0           11_osx64_openblas    conda-forge
libcblas                  3.9.0           11_osx64_openblas    conda-forge
libcxx                    12.0.1               habf9029_0    conda-forge
libffi                    3.4.2                he49afe7_4    conda-forge
libgfortran               5.0.0           9_3_0_h6c81a4c_23    conda-forge
libgfortran5              9.3.0               h6c81a4c_23    conda-forge
liblapack                 3.9.0           11_osx64_openblas    conda-forge
libopenblas               0.3.17          openmp_h3351f45_1    conda-forge
libxgboost                1.3.3                he49afe7_2    conda-forge
libzlib                   1.2.11            h9173be1_1013    conda-forge
lightgbm                  3.2.1            py39h9fcab8e_0    conda-forge
llvm-openmp               12.0.1               hda6cdc1_1    conda-forge
ncurses                   6.2                  h2e338ed_4    conda-forge
numpy                     1.21.2           py39h7eed0ac_0    conda-forge
openssl                   1.1.1l               h0d85af4_0    conda-forge
pandas                    1.3.3            py39h4d6be9b_0    conda-forge
pip                       21.3               pyhd8ed1ab_0    conda-forge
py-xgboost                1.3.3            py39h6e9494a_2    conda-forge
pycosat                   0.6.3           py39h89e85a6_1006    conda-forge
pycparser                 2.20               pyh9f0ad1d_2    conda-forge
pyopenssl                 21.0.0             pyhd8ed1ab_0    conda-forge
pysocks                   1.7.1            py39h6e9494a_3    conda-forge
python                    3.9.7           h1248fe1_3_cpython    conda-forge
python-dateutil           2.8.2              pyhd8ed1ab_0    conda-forge
python.app                1.3              py39h89e85a6_5    conda-forge
python_abi                3.9                      2_cp39    conda-forge
pytz                      2021.3             pyhd8ed1ab_0    conda-forge
readline                  8.1                  h05e3726_0    conda-forge
requests                  2.26.0             pyhd8ed1ab_0    conda-forge
ruamel_yaml               0.15.80         py39h89e85a6_1004    conda-forge
scikit-learn              1.0              py39hd2caeff_1    conda-forge
scipy                     1.7.1            py39h056f1c0_0    conda-forge
setuptools                58.2.0           py39h6e9494a_0    conda-forge
six                       1.16.0             pyh6c4a22f_0    conda-forge
sqlite                    3.36.0               h23a322b_2    conda-forge
threadpoolctl             3.0.0              pyh8a188c0_0    conda-forge
tk                        8.6.11               h5dbffcc_1    conda-forge
tqdm                      4.62.3             pyhd8ed1ab_0    conda-forge
tzdata                    2021c                he74cb21_0    conda-forge
urllib3                   1.26.7             pyhd8ed1ab_0    conda-forge
wheel                     0.37.0             pyhd8ed1ab_1    conda-forge
xgboost                   1.3.3            py39h6e9494a_2    conda-forge
xz                        5.2.5                haf1e3a3_1    conda-forge
yaml                      0.2.5                haf1e3a3_0    conda-forge
zlib                      1.2.11            h9173be1_1013    conda-forge
sonichi commented 3 years ago

@angela97lin Great that it works now. catboost is not available in conda, so we have to remove it from the dependencies to make the conda package. BTW, it'd be nice if you could share what application you use flaml for. Feel free to chat on Gitter.

angela97lin commented 3 years ago

@sonichi Got it. FWIW, we build our conda package with catboost as a dependency via conda-forge (https://anaconda.org/conda-forge/catboost), so that could be interesting to look into.

Closing this issue though since the original issue is resolved. Thanks again!