Hi,
I completed all of the 'Getting Started' instructions on Ubuntu 22.04 with an NVIDIA RTX 3090 and CUDA 12.1. I installed torch 2.3.0+cu121 and all of the other listed requirements. However, I am running into a problem with my flash_attn installation: I have tried many versions of flash_attn and several installation methods, but every attempt fails with this error:
PuzzleAvatar/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c105Error4whatEv
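In case it helps, this is a minimal way to reproduce the failure on my side (just importing flash_attn inside the PuzzleAvatar environment; the version prints are only there to confirm the setup described above):

```python
import torch

print(torch.__version__)              # 2.3.0+cu121
print(torch.version.cuda)             # 12.1
print(torch.cuda.get_device_name(0))  # NVIDIA GeForce RTX 3090

# Importing flash_attn raises the undefined-symbol error shown above
import flash_attn
```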
I saw on the flash_attn GitHub that others have faced this issue, and that it may be caused by the CUDA and torch versions being too new. Since you recommend CUDA 12.1, could you please let me know how you installed flash_attn so that it works with the other dependencies? Thanks for the help!