tcapelle opened 10 months ago
I have also tried `RUN pip install flash-attn --no-build-isolation`,
but even when it installs, it then raises:
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/workspace/mixtral/simple_inference.py", line 7, in <module>
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1372, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/usr/local/lib/python3.10/dist-packages/transformers/utils/import_utils.py", line 1384, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.training_args because of the following error (look up to see its traceback):
/usr/local/lib/python3.10/dist-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c1017SymbolicShapeMeta18init_is_contiguousEv
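That undefined symbol usually means the flash-attn CUDA extension was compiled against a different torch than the one shipped in the container. A minimal sketch of how one could check the container's torch and force a from-source build instead of a prebuilt binary; `FLASH_ATTENTION_FORCE_BUILD` is the build flag from flash-attn's setup.py, and none of this is a verified fix:

```bash
# Sketch only, not a verified fix: inspect the container's torch, then rebuild
# flash-attn from source so the CUDA extension links against that torch.
python -c "import torch; print(torch.__version__, torch.version.cuda)"

pip uninstall -y flash-attn

# FLASH_ATTENTION_FORCE_BUILD=TRUE (a flag in flash-attn's setup.py) skips the
# prebuilt-wheel download and compiles the extension locally; MAX_JOBS limits
# parallel compile jobs so the build does not exhaust memory.
FLASH_ATTENTION_FORCE_BUILD=TRUE MAX_JOBS=4 \
    pip install flash-attn --no-build-isolation
```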
It's because of the torch version change. The nvcr PyTorch 23.12 container should work with flash-attn v2.4.0.post1 now.
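If that combination is the intended fix, the Dockerfile would presumably look something like this (the `23.12-py3` tag and the version pin are my assumptions, and I have not confirmed this builds):

```dockerfile
# Sketch assuming the 23.12 NGC image and the flash-attn pin suggested above;
# not a confirmed-working recipe.
FROM nvcr.io/nvidia/pytorch:23.12-py3

# packaging is imported by flash-attn's setup.py; ninja speeds up the build.
RUN pip install ninja packaging

# --no-build-isolation compiles against the torch already in the image
# instead of pulling a separate torch into an isolated build environment.
RUN MAX_JOBS=4 pip install flash-attn==2.4.0.post1 --no-build-isolation
```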
Nope, not working yet...
What is the best way to do this?
Currently I am unable to install it on either the 23.12 or the 23.11 container, and the error is: