zimenglan-sysu-512 opened 2 weeks ago
Hi @zimenglan-sysu-512,
Thank you for your interest in our work. Which transformers version are you using? Please make sure that you are using transformers==4.41.0. Please let me know if it fixes the issue.
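For reference, the version pin suggested above can be applied like this (adjust for your environment, e.g. conda or a requirements file):

```shell
# Pin transformers to the version the reply recommends.
pip install "transformers==4.41.0"
```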
Got it. I also needed to update accelerate to 0.27.0, but I still hit a problem: [rank6]: RuntimeError: FlashAttention only supports Ampere GPUs or newer.
Hi @mmaaz60, can you show me how to disable flash attention? Thanks.
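Not from the authors, but a minimal sketch of one common way to do this with transformers (>= 4.36), assuming the script loads the model via `from_pretrained`: pass `attn_implementation="eager"` (or `"sdpa"`) instead of `"flash_attention_2"`. The checkpoint name in the comment below is a placeholder, not necessarily the one this repo uses.

```python
# Sketch: selecting the attention backend for transformers' from_pretrained.
# FlashAttention-2 requires Ampere (compute capability 8.0) or newer GPUs,
# so on older cards we fall back to the standard "eager" implementation.

def attention_kwargs(gpu_supports_flash: bool) -> dict:
    """Build the from_pretrained keyword arguments for the attention backend."""
    impl = "flash_attention_2" if gpu_supports_flash else "eager"
    return {"attn_implementation": impl}

# Example usage (placeholder checkpoint name):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "microsoft/Phi-3-mini-4k-instruct",
#     **attention_kwargs(gpu_supports_flash=False),
# )
```

Some training scripts also expose this as a CLI flag or hard-code `attn_implementation` in the loading code, so grep the repo for `flash_attention_2` as well.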
When I run the script, I hit this problem: ImportError: cannot import name 'Phi3Model' from 'transformers'