AI4Bharat / IndicTrans2

Translation models for 22 scheduled languages of India
https://ai4bharat.iitm.ac.in/indic-trans2
MIT License

Flash Attention on Mac #69

Closed samayra2029 closed 3 months ago

samayra2029 commented 3 months ago

Since Flash Attention is not supported on Mac (CUDA support is deprecated there), is there a way to avoid the hard dependency on Flash Attention?

PranjalChitale commented 3 months ago

Fixed in https://github.com/AI4Bharat/IndicTrans2/commit/450b0b6cb768ed529d6b851a7b59f0b3eb778637.
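For readers hitting the same issue, a minimal sketch of the general approach (hypothetical helper, not the exact code from the linked commit): detect whether the `flash-attn` package is importable and fall back to the standard attention implementation when it is not, e.g. on macOS where CUDA (and hence `flash-attn`) is unavailable.

```python
import importlib.util

def pick_attn_implementation() -> str:
    # flash-attn requires CUDA, so it cannot be installed on macOS;
    # fall back to the default "eager" attention in that case.
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "eager"

# The returned string can then be passed to Hugging Face transformers, e.g.:
# AutoModelForSeq2SeqLM.from_pretrained(model_name,
#     attn_implementation=pick_attn_implementation())
```

This keeps Flash Attention as an opt-in speedup on CUDA machines while letting the model load unchanged elsewhere.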

samayra2029 commented 3 months ago

Thanks, it worked!