-
## 1. The entire URL of the file you are using
https://github.com/tensorflow/models/blob/master/official/nlp/nhnet/models.py
## 2. Describe the bug
When saving a Bert2Bert model instance, I a…
-
## Environment info
- `transformers` version: 4.11.3
- Platform: Linux-5.4.104+-x86_64-with-Ubuntu-18.04-bionic
- Python version: 3.7.12
- PyTorch version (GPU?): 1.9.0+cu111 (False)
- Tensorfl…
-
Thank you for your great work!
I'm customizing an Encoder-Decoder model, but it can't receive additional parameters.
How about modifying it as below?
https://github.com/huggingface/transformers/blob/0ddadbf…
-
## Environment info
- `transformers` version: 4.10.0
- Platform: Windows-10-10.0.19042-SP0
- Python version: 3.9.6
- PyTorch version (GPU?): 1.9.0+cpu (False)
- Tensorflow version (GPU?): 2.6.0 (…
-
Hi,
I would like to train an Rnd2GPT model, whose encoder is a randomly initialized Transformer encoder and whose decoder uses the pre-trained GPT-2 model. I found that Hugging Face's "Encoder-Dec…
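One way to sketch such an Rnd2GPT setup with `EncoderDecoderModel` is to build the encoder from a fresh config (random weights) and load the decoder from the `gpt2` checkpoint. The BERT-architecture encoder and the default config values are illustrative assumptions, not the asker's exact setup:

```python
from transformers import (AutoModelForCausalLM, BertConfig, BertModel,
                          EncoderDecoderModel)

# Randomly initialized encoder (BERT architecture chosen here for illustration)
encoder = BertModel(BertConfig())

# Pre-trained GPT-2 decoder; is_decoder/add_cross_attention add the (randomly
# initialized) cross-attention layers the encoder-decoder setup needs
decoder = AutoModelForCausalLM.from_pretrained(
    "gpt2", is_decoder=True, add_cross_attention=True
)

# Combine into a single encoder-decoder model
model = EncoderDecoderModel(encoder=encoder, decoder=decoder)
```

Note that the cross-attention weights are new even though the rest of GPT-2 is pre-trained, so they need to be learned during fine-tuning.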
-
## Environment info
- `transformers` version: 4.12.0.dev0
- Platform: Windows-10-10.0.19042-SP0
- Python version: 3.9.5
- PyTorch version (GPU?): 1.9.0+cpu (False)
- Tensorflow version (GPU?): …
-
Hello everyone,
I'm trying to build a multiple-choice QA system using Bert2Bert. I'm following the model given for SWAG using T5 in [https://colab.research.google.com/github/patil-suraj/exploring-T5/blob/maste…
-
Hi,
Thanks for publishing this model.
1- Is there an example of fine-tuning it for summarization?
2- How can I fine-tune it on my dataset using masked LM in PyTorch?
Thanks
-
Hi,
you wrote that you used a BERT2BERT structure warm-started with ParsBERT's weights. Could you please provide code showing how you did this?
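A typical way to do this warm-start with `EncoderDecoderModel.from_encoder_decoder_pretrained` is sketched below; the exact ParsBERT checkpoint name is my assumption, so substitute the one actually used:

```python
from transformers import BertTokenizer, EncoderDecoderModel

# Checkpoint name is an assumption; substitute the ParsBERT checkpoint you use
ckpt = "HooshvareLab/bert-base-parsbert-uncased"

# Warm-start both encoder and decoder from ParsBERT weights; the decoder's
# cross-attention weights are newly initialized and learned during fine-tuning
model = EncoderDecoderModel.from_encoder_decoder_pretrained(ckpt, ckpt)
tokenizer = BertTokenizer.from_pretrained(ckpt)

# Configure the special tokens needed for generation
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```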