Shivanandroy / simpleT5

simpleT5 is built on top of PyTorch Lightning⚡️ and Transformers🤗, letting you quickly train your T5 models.
MIT License

How to generate the attention mask separately? #52

Open Liujingxiu23 opened 1 year ago

Liujingxiu23 commented 1 year ago

I want to use my own input embeddings (or my own tokenizer) instead of input_ids, but I do not know how to generate encoder_attention_mask and decoder_attention_mask by myself. Is there any code for this?

vishugupta96 commented 1 year ago

Facing the same problem. Not much info is provided on the inner workings of this model, which makes customization difficult.

DamithDR commented 1 year ago

You may refer to his other repository: https://github.com/Shivanandroy/T5-Finetuning-PyTorch
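In general, the attention mask is independent of how the inputs were embedded: it is just a (batch, seq_len) tensor of 1s for real positions and 0s for padding. A minimal sketch (not part of simpleT5; `make_attention_mask` is a hypothetical helper, and it assumes you know each sequence's true unpadded length):

```python
import torch

def make_attention_mask(lengths, max_len):
    """Return a (batch, max_len) mask: 1 for real tokens, 0 for padding."""
    positions = torch.arange(max_len).unsqueeze(0)              # (1, max_len)
    lengths = torch.tensor(lengths).unsqueeze(1)                # (batch, 1)
    return (positions < lengths).long()                         # (batch, max_len)

# Example: 3 sequences with true lengths 5, 3, 7, padded to length 7.
encoder_attention_mask = make_attention_mask([5, 3, 7], max_len=7)
decoder_attention_mask = make_attention_mask([4, 2, 6], max_len=6)
```

These can then be passed as `attention_mask` and `decoder_attention_mask` alongside `inputs_embeds` / `decoder_inputs_embeds` in the forward call of a Hugging Face T5 model; the causal (look-ahead) masking inside the decoder is applied by the model itself and does not need to be built by hand.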