facebookresearch / xformers

Hackable and optimized Transformers building blocks, supporting a composable construction.
https://facebookresearch.github.io/xformers/

🚀 Precompiled xFormers for CUDA 12.4 and PyTorch 2.4 Compatibility #1079

Open sashaok123 opened 1 month ago

sashaok123 commented 1 month ago

Feature
A precompiled version of xFormers that is compatible with CUDA 12.4 and PyTorch 2.4.

Motivation
Many users, including those working with projects like Forge, are now transitioning to newer versions of CUDA and PyTorch. However, the lack of precompiled xFormers binaries for CUDA 12.4 and PyTorch 2.4 creates a barrier for these users. Building from source is not always feasible, especially in restricted environments or for users who may not be comfortable with the complexities of manual builds.

Pitch
Providing a precompiled xFormers binary compatible with CUDA 12.4 and PyTorch 2.4 would simplify the installation process for a wide range of users. This would reduce the likelihood of build errors and allow users to focus on their core projects rather than troubleshooting build environments. It would also ensure better integration with projects like Forge, where xFormers is a key dependency.

Alternatives
Currently, the only alternative is to build xFormers from source, which can be time-consuming and error-prone, particularly for users unfamiliar with the process or those working in environments with limited permissions.

Additional Context
The availability of precompiled binaries would greatly benefit the community, especially as CUDA 12.4 and PyTorch 2.4 become more widely adopted. This request is in alignment with the needs of ongoing projects and would enhance the usability of xFormers in modern machine learning pipelines.

danthe3rd commented 1 month ago

Hi, we already have binaries pre-compiled for CUDA 12.1. You can install them with this command:

pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu121

sashaok123 commented 1 month ago

We already have binaries pre-compiled

Sorry, I've corrected the description: CUDA 12.4 and PyTorch 2.4.

sashaok123 commented 1 month ago

Forge works on the newer versions.

lw commented 1 month ago

Providing builds for multiple CUDA versions comes at a cost, which is why at the moment we only support two versions (11.8 and 12.1). Could you explain why you need CUDA 12.4 specifically and cannot use 12.1?

I believe PyTorch plans to drop support for CUDA 11.8 shortly (https://github.com/pytorch/pytorch/issues/123456); when they do, we will follow, and at that point we'll happily support newer versions of CUDA.

sashaok123 commented 1 month ago

Could you explain why you need CUDA 12.4 specifically and cannot use 12.1?

lllyasviel has overhauled Forge to work with a variety of image-generation models. It also ships a new version of Gradio and, as I understand it, a new approach to working with models.

Most people who have updated now get the error `TypeError: 'NoneType' object is not iterable` until xformers is removed from the environment.

lllyasviel recommends the following:

By default, when cloning the repository, CUDA 12.4 + PyTorch 2.4 is installed: https://github.com/lllyasviel/stable-diffusion-webui-forge

Zhengchai commented 1 month ago

Please vote for this request, i.e. to support CUDA 12.4 and PyTorch 2.4. I encountered the same error in our recent projects.

venv\Lib\site-packages\xformers\ops\fmha\dispatch.py", line 55, in _run_priority_list
    raise NotImplementedError(msg)
NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
    query : shape=(1, 257, 6, 64) (torch.float32)
    key : shape=(1, 257, 6, 64) (torch.float32)
    value : shape=(1, 257, 6, 64) (torch.float32)
    attn_bias : <class 'NoneType'>
    p : 0.0
`decoderF` is not supported because:
    device=cpu (supported: {'cuda'})
    attn_bias type is <class 'NoneType'>
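The dispatch error above comes from running `memory_efficient_attention` on CPU tensors: xFormers only ships GPU kernels, so float32 inputs on `device=cpu` match no backend. A minimal sketch of a workaround (shapes mirror the traceback; the CPU branch falls back to PyTorch's built-in scaled-dot-product attention rather than xFormers):

```python
import torch
import torch.nn.functional as F

# Shapes from the traceback above: (batch, seq_len, heads, head_dim)
device = "cuda" if torch.cuda.is_available() else "cpu"
q = torch.randn(1, 257, 6, 64, device=device)
k = torch.randn(1, 257, 6, 64, device=device)
v = torch.randn(1, 257, 6, 64, device=device)

if device == "cuda":
    # xFormers kernels only dispatch for CUDA tensors
    import xformers.ops as xops
    out = xops.memory_efficient_attention(q, k, v)
else:
    # CPU fallback: PyTorch's SDPA expects (batch, heads, seq_len, head_dim),
    # so transpose in and back out
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
    ).transpose(1, 2)

print(out.shape)  # torch.Size([1, 257, 6, 64])
```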

pandayummy commented 3 weeks ago

xFormers for CUDA 12.4 and PyTorch 2.4 needed.

\ComfyUI_windows_portable\python_embeded\Lib\site-packages\xformers\ops\fmha\flash.py:210: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.

AugmentedRealityCat commented 2 weeks ago

Looks like the xFormers team just granted your wish!

https://github.com/facebookresearch/xformers/actions/runs/10559887009

xformers-0.0.28.dev893+cu124-cp311-cp311-win_amd64.whl and other Python wheels for various versions of Python (3.8 to 3.12) and CUDA (11.8 to 12.4); there are ROCm and Ubuntu options as well, but I haven't tried those.

pandayummy commented 2 weeks ago


Cool!

RaafaRB commented 1 week ago

Looks like the xFormers team just granted your wish!

https://github.com/facebookresearch/xformers/actions/runs/10559887009

xformers-0.0.28.dev893+cu124-cp311-cp311-win_amd64.whl and other Python wheels for various versions of Python (3.8 to 3.12) and CUDA (11.8 to 12.4); there are ROCm and Ubuntu options as well, but I haven't tried those.

How do I install that specific version? I can't find it at https://download.pytorch.org/whl/xformers/. Sorry if this is an ignorant question.

sashaok123 commented 1 week ago

How do I install that specific version?

There is a universal build if you install the latest version:

pip install xformers==0.0.28.dev895
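After installing, it's worth verifying that the wheel matches the local PyTorch/CUDA pair. A small sketch (the version strings in the comments are examples, not guarantees):

```python
import torch

# Installed PyTorch and the CUDA version it was built against
print("torch:", torch.__version__)   # e.g. "2.4.0+cu124"
print("cuda :", torch.version.cuda)  # e.g. "12.4" (None on CPU-only builds)

try:
    import xformers
    print("xformers:", xformers.__version__)  # e.g. "0.0.28.dev895"
except ImportError:
    print("xformers is not installed in this environment")
```

Alternatively, `python -m xformers.info` prints the build details along with which attention operators are available.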

Yaruze66 commented 6 days ago

Will precompiled xFormers be available for PyTorch 2.4.1?

pandayummy commented 6 days ago

Will precompiled xFormers be available for PyTorch 2.4.1?

+1