vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0
31.08k stars 4.72k forks

Install error: `pip install -e .` #3267

njhouse365 closed this issue 1 day ago

njhouse365 commented 8 months ago

```
Obtaining file:///home/house365ai/xxm/vllm2
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build editable ... error
  error: subprocess-exited-with-error

  × Getting requirements to build editable did not run successfully.
  │ exit code: 1
  ╰─> [22 lines of output]
      /tmp/pip-build-env-wmgabzo_/overlay/lib/python3.10/site-packages/torch/nn/modules/transformer.py:20: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:84.)
        device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
      /home/house365ai/anaconda3/envs/vllm2/bin/python: No module named pip
      Traceback (most recent call last):
        File "/home/house365ai/anaconda3/envs/vllm2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/house365ai/anaconda3/envs/vllm2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/house365ai/anaconda3/envs/vllm2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 132, in get_requires_for_build_editable
          return hook(config_settings)
        File "/tmp/pip-build-env-wmgabzo_/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 448, in get_requires_for_build_editable
          return self.get_requires_for_build_wheel(config_settings)
        File "/tmp/pip-build-env-wmgabzo_/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/tmp/pip-build-env-wmgabzo_/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-wmgabzo_/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 311, in run_setup
          exec(code, locals())
        File "<string>", line 335, in <module>
        File "/home/house365ai/anaconda3/envs/vllm2/lib/python3.10/subprocess.py", line 369, in check_call
          raise CalledProcessError(retcode, cmd)
      subprocess.CalledProcessError: Command '['/home/house365ai/anaconda3/envs/vllm2/bin/python', '-m', 'pip', 'install', '-q', '--target=/home/house365ai/xxm/vllm2/vllm/thirdparty_files', 'einops', 'flash-attn==2.5.6', '--no-dependencies']' returned non-zero exit status 1.
      [end of output]
```
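For anyone hitting this before the fix lands: the failing command at the end of the log is vllm's setup.py shelling out to `python -m pip install ... flash-attn` inside pip's isolated build environment, which is missing pip and numpy (`No module named pip`, `Failed to initialize NumPy`). A hedged workaround, not the official fix, is to pre-install the build-time dependencies and disable build isolation so the build reuses the current environment:

```shell
# Hypothetical workaround: make the build-time deps available first,
# then build without pip's isolated environment so the setup.py
# subprocess can find pip and numpy.
pip install numpy ninja packaging setuptools wheel
pip install --no-build-isolation -e .   # run from the vllm checkout
```

This sidesteps the isolated-env problem rather than fixing it; updating to a version of main that includes the real fix is the cleaner option.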

phymbert commented 8 months ago

Same here; reverting to commit 82091b864af105dbe373353655dc9d8c0a6ba66f fixed the issue.
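For reference, the revert described above can be sketched like this (assuming a local git clone of the vllm repo):

```shell
# Pin the checkout to the known-good commit and reinstall from source.
cd vllm   # your local clone
git checkout 82091b864af105dbe373353655dc9d8c0a6ba66f
pip install -e .
```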

mgoin commented 8 months ago

Should be addressed by https://github.com/vllm-project/vllm/pull/3269, which has landed on main.

njhouse365 commented 8 months ago

Thanks everyone!

jinggaizi commented 8 months ago

@njhouse365 did you manage to solve this problem? I have a similar one:

```
error: subprocess-exited-with-error

× Building editable for vllm (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [605 lines of output]
    /tmp/pip-build-env-tfo2bdmm/overlay/local/lib/python3.10/dist-packages/torch/nn/modules/transformer.py:20: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:84.)
      device: torch.device = torch.device(torch._C._get_default_device()),  # torch.device('cpu'),
    <string>:217: UserWarning: Unsupported CUDA architectures ({'6.1', '5.2', '8.7', '7.2', '6.0'}) are excluded from the `TORCH_CUDA_ARCH_LIST` env variable (5.2 6.0 6.1 7.0 7.2 7.5 8.0 8.6 8.7 9.0+PTX). Supported CUDA architectures are: {'8.9+PTX', '7.0+PTX', '7.5+PTX', '9.0+PTX', '9.0', '8.0', '7.0', '8.6+PTX', '8.0+PTX', '8.9', '7.5', '8.6'}.
    running editable_wheel
    creating /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info
    writing /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/PKG-INFO
    writing dependency_links to /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/dependency_links.txt
    writing requirements to /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/requires.txt
    writing top-level names to /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/top_level.txt
    writing manifest file '/tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/SOURCES.txt'
    reading manifest file '/tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    adding license file 'LICENSE'
    writing manifest file '/tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm.egg-info/SOURCES.txt'
    creating '/tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm-0.3.3+cu122.dist-info'
    creating /tmp/pip-wheel-eb0cs86q/.tmp-uxgt0dtu/vllm-0.3.3+cu122.dist-info/WHEEL
    running build_py
    running build_ext
    /tmp/pip-build-env-tfo2bdmm/overlay/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py:414: UserWarning: The detected CUDA version (12.2) has a minor version mismatch with the version that was used to compile PyTorch (12.1). Most likely this shouldn't be a problem.
      warnings.warn(CUDA_MISMATCH_WARN.format(cuda_str_version, torch.version.cuda))
    /tmp/pip-build-env-tfo2bdmm/overlay/local/lib/python3.10/dist-packages/torch/utils/cpp_extension.py:424: UserWarning: There are no x86_64-linux-gnu-g++ version bounds defined for CUDA version 12.2
```

My environment is the Docker image nvcr.io/nvidia/pytorch:23.10-py3.
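A side note on the `TORCH_CUDA_ARCH_LIST` warning in that log: it means the environment's arch list includes architectures the installed PyTorch cannot target. A hedged way to silence it and shrink the build is to restrict the list to the card actually in use (for example, a T4 is compute capability 7.5); the exact value depends on your GPU:

```shell
# Limit the CUDA architectures the extension build targets.
# "7.5+PTX" matches a T4; adjust for your GPU, then rerun
# `pip install -e .` from the vllm checkout.
export TORCH_CUDA_ARCH_LIST="7.5+PTX"
echo "building for: $TORCH_CUDA_ARCH_LIST"
```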
jinggaizi commented 8 months ago

My graphics card is an NVIDIA T4.

jinggaizi commented 8 months ago

I installed vllm on an NVIDIA A10 and it works.

youkaichao commented 8 months ago

@jinggaizi can you try `pip install -vvv -e .` to get more detailed error output?
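When running the verbose install, it helps to capture the log to a file, since the real compiler error is usually buried hundreds of lines above the final traceback (log path here is just an example):

```shell
# Keep the full verbose log on disk so the underlying build error
# can be searched after the install fails.
pip install -vvv -e . 2>&1 | tee /tmp/vllm-install.log
grep -in "error" /tmp/vllm-install.log | head -n 20
```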

yjsunn commented 6 months ago

I have the same error here. How did you solve it?

github-actions[bot] commented 1 month ago

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

github-actions[bot] commented 1 day ago

This issue has been automatically closed due to inactivity. Please feel free to reopen if you feel it is still relevant. Thank you!