tlrmchlsmth opened this issue 3 months ago
cc @zheng-ningxin
Same for me.
Hello @tlrmchlsmth @Rainlin007, sorry for the late reply. Could you please let me know your `torch.__version__` and `torch.version.cuda` values?
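For reference, a minimal snippet to print the two values requested above (the fallback branch only covers environments where torch is not installed):

```python
# Report the installed torch version and the CUDA version it was built against.
try:
    import torch
    torch_version = torch.__version__
    cuda_version = torch.version.cuda  # None for CPU-only builds
except ImportError:
    torch_version = None
    cuda_version = None

print("torch:", torch_version)
print("torch.version.cuda:", cuda_version)
```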
torch is 2.4 and `torch.version.cuda` is 12.1.

If it helps, here is my `pip freeze` output after running `pip install torch packaging wheel numpy` and before running `pip install byte-flux`:
filelock==3.15.4
fsspec==2024.6.1
Jinja2==3.1.4
MarkupSafe==2.1.5
mpmath==1.3.0
networkx==3.3
numpy==2.1.0
nvidia-cublas-cu12==12.1.3.1
nvidia-cuda-cupti-cu12==12.1.105
nvidia-cuda-nvrtc-cu12==12.1.105
nvidia-cuda-runtime-cu12==12.1.105
nvidia-cudnn-cu12==9.1.0.70
nvidia-cufft-cu12==11.0.2.54
nvidia-curand-cu12==10.3.2.106
nvidia-cusolver-cu12==11.4.5.107
nvidia-cusparse-cu12==12.1.0.106
nvidia-nccl-cu12==2.20.5
nvidia-nvjitlink-cu12==12.6.20
nvidia-nvtx-cu12==12.1.105
packaging==24.1
sympy==1.13.2
torch==2.4.0
triton==3.0.0
typing_extensions==4.12.2
This is a blocker on https://github.com/vllm-project/vllm/pull/5917.
It seems no wheel package was compiled for torch 2.4.0. @zheng-ningxin
@zheng-ningxin can we prioritize this issue, since it is blocking @tlrmchlsmth's PR from merging? I think we just missed torch 2.4, right?
Sorry for the late reply, @tlrmchlsmth . We will take a look and fix it to unblock you. Thanks.
@tlrmchlsmth Please check the new release here: https://github.com/bytedance/flux/releases/tag/v1.0.3, made by @zheng-ningxin, which includes torch 2.4 and CUDA 12.1.
Describe the bug
I'm unable to install byte-flux from PyPI.
To Reproduce
In a fresh venv, run `pip install torch packaging wheel numpy` and then `pip install byte-flux`.
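As a reproduction script, the steps above can be sketched as follows, using the commands described earlier in the thread (the venv name is illustrative):

```shell
# Create a fresh virtual environment ("flux-venv" is an arbitrary name).
python -m venv flux-venv
. flux-venv/bin/activate

# Install the build prerequisites first.
pip install torch packaging wheel numpy

# Then attempt to install byte-flux from PyPI; this is the step that fails.
pip install byte-flux
```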
Sidenote: It should be possible to install flux without manually installing the requirements.
Stack trace/logs