umarbutler opened 3 weeks ago
Currently, CUDA 12.3 is supported, and that version already supports Flash Attention 2. You could give it a try; there is no significant difference between 12.4 and 12.3.
The only problem is that PyTorch does not have wheels for CUDA 12.3...
Feature Description
I would like to request support for CUDA 12.4, at least on Windows.
Currently, it is impossible to have the latest versions of PyTorch, Flash Attention 2, and PaddlePaddle installed at the same time.
The reason is that PyTorch requires CUDA 11.8, 12.1, or 12.4; Flash Attention 2 requires a CUDA 12.x release; and PaddlePaddle requires 11.2, 11.6, 11.7, 11.8, 12.0, or 12.3.
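To make the conflict concrete, here is a minimal sketch that intersects the supported-CUDA lists stated above (the version sets are copied from this issue; Flash Attention 2 is modeled simply as "any 12.x release"):

```python
# Supported CUDA versions as stated in this issue.
pytorch = {"11.8", "12.1", "12.4"}
paddlepaddle = {"11.2", "11.6", "11.7", "11.8", "12.0", "12.3"}

def fa2_ok(version: str) -> bool:
    # Flash Attention 2 needs some CUDA 12.x release (per the description above).
    return version.startswith("12.")

# Versions both PyTorch and PaddlePaddle support, filtered by the FA2 requirement.
compatible = {v for v in pytorch & paddlepaddle if fa2_ok(v)}
print(compatible)  # → set(): 11.8 is shared, but it is not a 12.x release
```

The only version PyTorch and PaddlePaddle have in common is 11.8, which Flash Attention 2 rules out, so no single CUDA toolkit satisfies all three.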
Alternatives
No response