While running finetune_lora.sh on Colab, I encountered the following issue.
```
Traceback (most recent call last):
  File "/content/mPLUG-Owl/mPLUG-Owl2/mplug_owl2/train/llama_flash_attn_monkey_patch.py", line 10, in <module>
    from flash_attn.flash_attn_interface import flash_attn_unpadded_qkvpacked_func
  File "/usr/local/lib/python3.10/dist-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/usr/local/lib/python3.10/dist-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /usr/local/lib/python3.10/dist-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/content/mPLUG-Owl/mPLUG-Owl2/mplug_owl2/train/train_mem.py", line 7, in <module>
    from mplug_owl2.train.llama_flash_attn_monkey_patch import replace_llama_attn_with_flash_attn
  File "/content/mPLUG-Owl/mPLUG-Owl2/mplug_owl2/train/llama_flash_attn_monkey_patch.py", line 12, in <module>
    from flash_attn.flash_attn_interface import flash_attn_varlen_qkvpacked_func as flash_attn_unpadded_qkvpacked_func
  File "/usr/local/lib/python3.10/dist-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/usr/local/lib/python3.10/dist-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
```
I am using PyTorch version 2.0.1+cu117.
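For context, an `undefined symbol` error from `flash_attn_2_cuda` usually means the installed flash-attn wheel was compiled against a different PyTorch build than the one in the environment. A minimal sketch for printing the installed versions of both packages (so others can spot a mismatch) is:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version string for pkg, or None if absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

# Print the versions of the two packages involved in the ImportError.
for pkg in ("torch", "flash-attn"):
    print(pkg, installed_version(pkg) or "not installed")
```

Reinstalling flash-attn with a wheel built for the exact torch/CUDA combination (or building it from source in the same environment) is the usual fix for this kind of ABI mismatch, as far as I understand.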
Can anyone help with this? Thanks!