Open · JohnnyStreet opened 2 weeks ago
Thank you for following the development in transformers. That function was changed in https://github.com/huggingface/transformers/commit/8a734ea2c340beee23e665601919814918bf4c43, which is not yet in a released version as far as I can tell. I think for Coqui it would be best to set an upper limit on transformers again (I'll do that in #149), and then after the new version has been released for some time we can adapt our code.
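For reference, such a pin would look something like the fragment below in the package requirements. The exact bounds are illustrative assumptions, not the values from #149:

```
# Pin transformers below the release that changes
# _prepare_attention_mask_for_generation (bounds are illustrative).
transformers>=4.33.0,<4.41.0
```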
`_prepare_attention_mask_for_generation` now expects a config and kwargs, but it is being passed tensors here (values that are accessible through the config anyway).
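Once the new transformers version is released, one way to adapt the code without dropping support for older releases would be to inspect the function's parameters and dispatch accordingly. The sketch below is illustrative only: the parameter names (`pad_token_id`/`eos_token_id` for the old style, `generation_config`/`model_kwargs` for the new) are assumptions, and the two stand-in functions mimic the two signatures rather than calling the real transformers API.

```python
import inspect

def call_prepare_attention_mask(fn, inputs, generation_config, model_kwargs):
    """Dispatch to either the old tensor-based or the new config-based
    signature of a _prepare_attention_mask_for_generation-like function."""
    params = inspect.signature(fn).parameters
    if "generation_config" in params:
        # New-style API: pass the config and kwargs through directly.
        return fn(inputs, generation_config=generation_config,
                  model_kwargs=model_kwargs)
    # Old-style API: pull the individual token ids out of the config.
    return fn(inputs,
              pad_token_id=generation_config["pad_token_id"],
              eos_token_id=generation_config["eos_token_id"])

# Stand-in implementations of the two assumed signatures (illustration only).
def old_style(inputs, pad_token_id=None, eos_token_id=None):
    return ("old", pad_token_id, eos_token_id)

def new_style(inputs, generation_config=None, model_kwargs=None):
    return ("new", generation_config["pad_token_id"])
```

With a wrapper like this, the call site does not need to know which transformers release is installed; the pin could then be lifted gradually.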