rmihaylov / falcontune

Tune any FALCON in 4-bit
Apache License 2.0

Missing compatibility with torch 1.13 #6

Open phisad opened 1 year ago

phisad commented 1 year ago

When we run the model on our servers, we encounter the following problem:

```
File "/home/users//.cache/huggingface/modules/transformers_modules/tiiuae/falcon-40b-instruct/5b9409410d251ab8e06c48078721c8e2b71fa8a1/modelling_RW.py", line 289, in forward
    attn_output = F.scaled_dot_product_attention(
AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'
```

Are there any plans to make the model work on somewhat older versions of PyTorch as well?

I think this would be useful, as many people are still on 1.13.
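For context: `F.scaled_dot_product_attention` was only introduced in PyTorch 2.0, so it is absent on 1.13. One possible workaround (a sketch of my own, not an official falcontune or transformers fix, and it ignores optimized 2.0-only kernels and extra keyword arguments the model code may pass) is to monkey-patch a plain-PyTorch fallback onto `torch.nn.functional` before loading the model:

```python
import math
import torch
import torch.nn.functional as F

def sdpa_fallback(query, key, value, attn_mask=None):
    """Plain-PyTorch stand-in for F.scaled_dot_product_attention (torch >= 2.0).

    Computes softmax(Q K^T / sqrt(d)) V. Supports only an additive attn_mask;
    dropout and is_causal handling are omitted in this sketch.
    """
    scale = 1.0 / math.sqrt(query.size(-1))
    scores = torch.matmul(query, key.transpose(-2, -1)) * scale
    if attn_mask is not None:
        scores = scores + attn_mask  # additive mask, e.g. -inf on blocked positions
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value)

# Patch only when the real implementation is missing (i.e. on torch < 2.0).
if not hasattr(F, "scaled_dot_product_attention"):
    F.scaled_dot_product_attention = sdpa_fallback
```

Running this before the model's `modelling_RW.py` is imported would let the `F.scaled_dot_product_attention(...)` call at line 289 resolve, though upgrading to torch >= 2.0 is the cleaner fix if your servers allow it.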