Closed: samayra2029 closed this issue 3 months ago.
Since Flash Attention is not supported on Mac (CUDA support there is deprecated), is there a way to avoid a hard dependency on Flash Attention?
Fixed in https://github.com/AI4Bharat/IndicTrans2/commit/450b0b6cb768ed529d6b851a7b59f0b3eb778637.
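For context, the usual pattern for this kind of fix is to guard the flash-attn import and fall back to standard PyTorch attention when the package (or CUDA) is unavailable. The sketch below is illustrative only; the function and flag names (`attention`, `HAS_FLASH_ATTN`) are hypothetical and do not necessarily match the identifiers in the linked commit.

```python
import torch
import torch.nn.functional as F

# Guard the import: flash-attn is CUDA-only, so it cannot be
# installed or used on Mac. Fall back gracefully if it's missing.
try:
    from flash_attn import flash_attn_func
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False


def attention(q, k, v, dropout_p=0.0):
    """Use flash-attn when available and on GPU; otherwise use PyTorch SDPA.

    q, k, v are (batch, seqlen, nheads, headdim) tensors, the layout
    flash_attn_func expects.
    """
    if HAS_FLASH_ATTN and q.is_cuda:
        return flash_attn_func(q, k, v, dropout_p=dropout_p)
    # scaled_dot_product_attention expects (batch, nheads, seqlen, headdim),
    # so transpose in and out of that layout for the fallback path.
    return F.scaled_dot_product_attention(
        q.transpose(1, 2),
        k.transpose(1, 2),
        v.transpose(1, 2),
        dropout_p=dropout_p,
    ).transpose(1, 2)
```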
Thanks, it worked!