pengHTYX / Era3D

GNU Affero General Public License v3.0
443 stars · 20 forks

xformers error #20

Open · shirokalu opened this issue 2 weeks ago

shirokalu commented 2 weeks ago

hi, I ran pip install xformers-0.0.23.post1-cp39-cp39-manylinux2014_x86_64.whl, but I get an error: xFormers wasn't build with CUDA support

pengHTYX commented 2 weeks ago

@shirokalu, hi, can you provide more information about how you're running it?

kungfooman commented 2 weeks ago

wget https://download.pytorch.org/whl/cu118/xformers-0.0.26.post1%2Bcu118-cp39-cp39-manylinux2014_x86_64.whl#sha256=8e862ec2507d2df58b4f1320043c4e5c1496a1c2c9e5c446392b9c9d6bd6ceb7
pip install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64.whl

(the way I understand it is cu118 in the URL means CUDA support)
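One way to see why the "wasn't build with CUDA support" warning appears is to compare the "+cuXXX" local version tags that CUDA-enabled wheels carry. A minimal sketch (the helper names here are illustrative, not part of xformers or torch):

```python
# Minimal sketch: a CUDA-enabled wheel reports a version like
# "2.1.2+cu118", while a CPU-only xformers wheel has no "+cuXXX" tag,
# which is what triggers "xFormers wasn't build with CUDA support"
# even though `import xformers` succeeds.

def cuda_tag(version):
    """Return the CUDA tag of a wheel version string, e.g. 'cu118', or None."""
    _, sep, local = version.partition("+")
    return local if sep and local.startswith("cu") else None

def tags_match(torch_version, xformers_version):
    """True only when both wheels are CUDA builds with the same tag."""
    t, x = cuda_tag(torch_version), cuda_tag(xformers_version)
    return t is not None and t == x

# In practice you would feed in torch.__version__ and xformers.__version__:
print(cuda_tag("2.1.2+cu118"))                           # cu118
print(cuda_tag("0.0.23.post1"))                          # None (CPU-only wheel)
print(tags_match("2.1.2+cu118", "0.0.26.post1+cu118"))   # True
print(tags_match("2.1.2+cu118", "0.0.23.post1"))         # False
```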

pengHTYX commented 2 weeks ago

It seems that xformers+cu118 was not installed successfully.

kungfooman commented 2 weeks ago

It seems that xformers+cu118 was not installed successfully.

That's right, xformers was installed, but without CUDA support.

One can use this command for testing: python -m xformers.info

Output should be something like:

Unable to find python bindings at /usr/local/dcgm/bindings/python3. No data will be captured.
xFormers 0.0.23.post1+cu118
memory_efficient_attention.cutlassF:               available
memory_efficient_attention.cutlassB:               available
memory_efficient_attention.decoderF:               available
memory_efficient_attention.flshattF@v2.3.6:        available
memory_efficient_attention.flshattB@v2.3.6:        available
memory_efficient_attention.smallkF:                available
memory_efficient_attention.smallkB:                available
memory_efficient_attention.tritonflashattF:        unavailable
memory_efficient_attention.tritonflashattB:        unavailable
memory_efficient_attention.triton_splitKF:         available
indexing.scaled_index_addF:                        available
indexing.scaled_index_addB:                        available
indexing.index_select:                             available
swiglu.dual_gemm_silu:                             available
swiglu.gemm_fused_operand_sum:                     available
swiglu.fused.p.cpp:                                available
is_triton_available:                               True
pytorch.version:                                   2.1.2+cu118
pytorch.cuda:                                      available
gpu.compute_capability:                            8.6
gpu.name:                                          NVIDIA GeForce RTX 3090
dcgm_profiler:                                     unavailable
build.info:                                        available
build.cuda_version:                                1108
build.python_version:                              3.9.18
build.torch_version:                               2.1.2+cu118
build.env.TORCH_CUDA_ARCH_LIST:                    5.0+PTX 6.0 6.1 7.0 7.5 8.0+PTX 9.0
build.env.XFORMERS_BUILD_TYPE:                     Release
build.env.XFORMERS_ENABLE_DEBUG_ASSERTIONS:        None
build.env.NVCC_FLAGS:                              None
build.env.XFORMERS_PACKAGE_FROM:                   wheel-v0.0.23.post1
build.nvcc_version:                                11.8.89
source.privacy:                                    open source

At least Era3D works for me :sweat_smile:

shirokalu commented 2 weeks ago

I tried to install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64, but the installation insisted that I install the dependency torch==2.3. I attempted to install it with the --no-deps option, but it still didn't work.

kungfooman commented 2 weeks ago

I tried to install xformers-0.0.26.post1+cu118-cp39-cp39-manylinux2014_x86_64, but the installation insisted that I install the dependency torch==2.3. I attempted to install it with the --no-deps option, but it still didn't work.

It also installed torch==2.3 for me, but it worked nicely when running python app.py (not this repo, but the Huggingface one).
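If letting pip pull in torch 2.3 is not acceptable, an alternative (a sketch based on the versions shown in the xformers.info dump above; I have not verified this exact pair) is to pin torch 2.1.2 together with its matching cu118 xformers wheel:

```shell
# Sketch (untested here): install a torch/xformers pair whose +cu118
# tags match, from PyTorch's cu118 wheel index, so pip does not replace
# torch with 2.3 as a dependency of a newer xformers.
pip install torch==2.1.2 xformers==0.0.23.post1 \
    --index-url https://download.pytorch.org/whl/cu118

# Then confirm the CUDA kernels are really present:
python -m xformers.info
```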