PKU-DAIR / open-box

Generalized and Efficient Blackbox Optimization System
https://open-box.readthedocs.io

Can't install or compile in Ubuntu 23.10 #94

Open aivuk opened 6 months ago

aivuk commented 6 months ago

Describe the bug Whenever I try to install it via pip install or compile it from source, I get a series of errors while compiling scikit-learn.

To Reproduce Steps to reproduce the behavior:

  1. pip install openbox

or:

  1. git clone https://github.com/PKU-DAIR/open-box.git
  2. cd open-box
  3. pip install -e .

Expected behavior The package installs and compiles without errors.
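
For reference, a quick way to confirm a successful install afterwards is a plain import check (a minimal sketch; it only verifies that the package imports under the current interpreter, nothing OpenBox-specific is assumed):

      # record the interpreter version the install ran under
      python --version
      # a successful build/install should make the package importable
      python -c "import openbox; print(openbox.__file__)"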

Outputs and Logs

      [ 1/55] Cythonizing sklearn/__check_build/_check_build.pyx
      [ 2/55] Cythonizing sklearn/_isotonic.pyx
      [ 3/55] Cythonizing sklearn/cluster/_dbscan_inner.pyx
      [ 4/55] Cythonizing sklearn/cluster/_hierarchical_fast.pyx
      [ 5/55] Cythonizing sklearn/cluster/_k_means_common.pyx
      [ 6/55] Cythonizing sklearn/cluster/_k_means_elkan.pyx
      [ 7/55] Cythonizing sklearn/cluster/_k_means_lloyd.pyx
      [ 8/55] Cythonizing sklearn/cluster/_k_means_minibatch.pyx
      [ 9/55] Cythonizing sklearn/datasets/_svmlight_format_fast.pyx
      [10/55] Cythonizing sklearn/decomposition/_cdnmf_fast.pyx
      [11/55] Cythonizing sklearn/decomposition/_online_lda_fast.pyx
      [12/55] Cythonizing sklearn/ensemble/_gradient_boosting.pyx
      [13/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_binning.pyx
      [14/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_bitset.pyx
      [15/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_gradient_boosting.pyx
      [16/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_loss.pyx
      [17/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/_predictor.pyx
      [18/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/common.pyx
      [19/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/histogram.pyx
      [20/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
      [21/55] Cythonizing sklearn/ensemble/_hist_gradient_boosting/utils.pyx
      [22/55] Cythonizing sklearn/feature_extraction/_hashing_fast.pyx
      [23/55] Cythonizing sklearn/linear_model/_cd_fast.pyx
      [24/55] Cythonizing sklearn/linear_model/_sag_fast.pyx
      [25/55] Cythonizing sklearn/linear_model/_sgd_fast.pyx
      [26/55] Cythonizing sklearn/manifold/_barnes_hut_tsne.pyx
      [27/55] Cythonizing sklearn/manifold/_utils.pyx
      [28/55] Cythonizing sklearn/metrics/_dist_metrics.pyx
      [29/55] Cythonizing sklearn/metrics/_pairwise_fast.pyx
      [30/55] Cythonizing sklearn/metrics/cluster/_expected_mutual_info_fast.pyx
      [31/55] Cythonizing sklearn/neighbors/_ball_tree.pyx
      [32/55] Cythonizing sklearn/neighbors/_kd_tree.pyx
      [33/55] Cythonizing sklearn/neighbors/_partition_nodes.pyx
      [34/55] Cythonizing sklearn/neighbors/_quad_tree.pyx
      [35/55] Cythonizing sklearn/preprocessing/_csr_polynomial_expansion.pyx
      [36/55] Cythonizing sklearn/svm/_liblinear.pyx
      [37/55] Cythonizing sklearn/svm/_libsvm.pyx
      [38/55] Cythonizing sklearn/svm/_libsvm_sparse.pyx
      [39/55] Cythonizing sklearn/svm/_newrand.pyx
      [40/55] Cythonizing sklearn/tree/_criterion.pyx
      [41/55] Cythonizing sklearn/tree/_splitter.pyx
      [42/55] Cythonizing sklearn/tree/_tree.pyx
      [43/55] Cythonizing sklearn/tree/_utils.pyx
      [44/55] Cythonizing sklearn/utils/_cython_blas.pyx
      [45/55] Cythonizing sklearn/utils/_fast_dict.pyx
      [46/55] Cythonizing sklearn/utils/_logistic_sigmoid.pyx
      [47/55] Cythonizing sklearn/utils/_openmp_helpers.pyx
      [48/55] Cythonizing sklearn/utils/_random.pyx
      [49/55] Cythonizing sklearn/utils/_readonly_array_wrapper.pyx
      [50/55] Cythonizing sklearn/utils/_seq_dataset.pyx
      [51/55] Cythonizing sklearn/utils/_typedefs.pyx
      [52/55] Cythonizing sklearn/utils/_weight_vector.pyx
      [53/55] Cythonizing sklearn/utils/arrayfuncs.pyx
      [54/55] Cythonizing sklearn/utils/murmurhash.pyx
      [55/55] Cythonizing sklearn/utils/sparsefuncs_fast.pyx

      The above exception was the direct cause of the following exception:

      Traceback (most recent call last):
        File "/home/aivuk/open-box/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/aivuk/open-box/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/aivuk/open-box/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 149, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 174, in prepare_metadata_for_build_wheel
          self.run_setup()
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 268, in run_setup
          self).run_setup(setup_script=setup_script)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 158, in run_setup
          exec(compile(code, __file__, 'exec'), locals())
        File "setup.py", line 319, in <module>
          setup_package()
        File "setup.py", line 315, in setup_package
          setup(**metadata)
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/numpy/distutils/core.py", line 135, in setup
          config = configuration()
                   ^^^^^^^^^^^^^^^
        File "setup.py", line 201, in configuration
          config.add_subpackage("sklearn")
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/numpy/distutils/misc_util.py", line 1050, in add_subpackage
          config_list = self.get_subpackage(subpackage_name, subpackage_path,
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/numpy/distutils/misc_util.py", line 1016, in get_subpackage
          config = self._get_configuration_from_setup_py(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/numpy/distutils/misc_util.py", line 958, in _get_configuration_from_setup_py
          config = setup_module.configuration(*args)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-install-tg1fd3zz/scikit-learn_3114b9eda78e49099f371788c2af80df/sklearn/setup.py", line 85, in configuration
          cythonize_extensions(top_path, config)
        File "/tmp/pip-install-tg1fd3zz/scikit-learn_3114b9eda78e49099f371788c2af80df/sklearn/_build_utils/__init__.py", line 78, in cythonize_extensions
          config.ext_modules = cythonize(
                               ^^^^^^^^^^
        File "/tmp/pip-build-env-c3an5qpm/overlay/lib/python3.11/site-packages/Cython/Build/Dependencies.py", line 1145, in cythonize
          result.get(99999)  # seconds
          ^^^^^^^^^^^^^^^^^
        File "/usr/lib/python3.11/multiprocessing/pool.py", line 774, in get
          raise self._value
      Cython.Compiler.Errors.CompileError: sklearn/ensemble/_hist_gradient_boosting/splitting.pyx
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

Additional context

Python 3.11.6

jhj0411jhj commented 6 months ago

Please use Python 3.8, 3.9, or 3.10 at present. We are working on supporting Python 3.11, but it may take a while.
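
For reference, one way to get a compatible interpreter on Ubuntu 23.10 (which ships Python 3.11 by default) is an isolated environment with an older Python, for example via conda (a sketch assuming conda or Miniconda is available; any tool that provides Python 3.10 would do):

      # create and activate an isolated Python 3.10 environment
      conda create -n openbox-py310 python=3.10
      conda activate openbox-py310
      # install openbox against the supported interpreter
      pip install openbox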

Xbc-gressor commented 4 months ago

We have recently added support for Python 3.11. You can try again! Please feel free to contact us if any new problems arise.
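
If an earlier failed build left packages half-installed, retrying from a clean virtual environment avoids stale build artifacts (a sketch assuming the stock Python 3.11 on Ubuntu 23.10):

      # start from a fresh virtual environment
      python3.11 -m venv ~/openbox-env
      source ~/openbox-env/bin/activate
      # a recent pip is more likely to pick up prebuilt wheels instead of building from source
      pip install --upgrade pip
      pip install openbox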