-
Hi there,
I trained an MT5ForConditionalGeneration model. During training, I used my own embeddings for encoding (but the default embeddings for decoding). However, when I try to generate output using g…
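For reference, a minimal sketch (not the reporter's actual code; the model name and the custom embedding table are assumptions) of swapping custom embeddings into the MT5 encoder while the decoder keeps the default shared embeddings, and then calling `generate()`:

```python
# Hedged sketch: custom encoder embeddings, default decoder embeddings.
# "google/mt5-small" and the embedding table below are assumptions for illustration.
import torch
from transformers import MT5ForConditionalGeneration, T5Tokenizer

model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")
tokenizer = T5Tokenizer.from_pretrained("google/mt5-small")

# Hypothetical custom embedding table with the same vocab size and hidden width.
custom_embeddings = torch.nn.Embedding(model.config.vocab_size, model.config.d_model)
model.encoder.set_input_embeddings(custom_embeddings)  # encoder side only
# The decoder (and the tied lm_head) still uses the default shared embeddings.

inputs = tokenizer("summarize: Hello world.", return_tensors="pt")
output_ids = model.generate(**inputs, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```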
-
## Environment data
- Language Server version: 2021.5.2-pre.1
- OS and version: Windows
- Python version (& distribution if applicable, e.g. Anaconda): Anaconda
## Actual behavio…
-
tokenizer = T5Tokenizer.from_pretrained("imxly/t5-pegasus")  # google/mt5-base
OSError: Can't load tokenizer for 'imxly/t5-pegasus'. Make sure that:
- 'imxly/t5-pegasus' is a correct model identifi…
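One possible workaround (an assumption on my part: the t5-pegasus release is usually packaged with a BERT-style vocab.txt rather than a SentencePiece file, so T5Tokenizer finds nothing to load) is to load the tokenizer with BertTokenizer instead:

```python
# Hedged sketch: assuming 'imxly/t5-pegasus' ships vocab.txt instead of spiece.model,
# BertTokenizer can load the vocabulary where T5Tokenizer cannot.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("imxly/t5-pegasus")
print(tokenizer.tokenize("天气不错"))
```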
-
**Schedule**
- First development release: 16 March 2021
- Beta release: 23 March 2021
- Production release: 30 March 2021
Docs: [https://pythainlp.github.io/docs/2.3/index.html](https://pythainl…
-
## Environment info
- `transformers` version: 4.3.2
- Platform: linux
- Python version: 3.7
- PyTorch version (GPU?): 1.7
- Tensorflow version (GPU?): -
- Using GPU in script?: yes
- Usi…
-
1. The fine-tuned model is saved as follows:
t5.save_weights_as_checkpoint("best_model_mt5/best_model.ckpt")
2. At inference time, the model is loaded as follows:
fine_tune_checkpoint_path = 'best_model_mt5/best_model.ckpt'
# load the tokenizer
tokenizer = SpTokenizer(…
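For reference, a minimal inference-side loading sketch, assuming bert4keras (which provides `save_weights_as_checkpoint` and `SpTokenizer`); `config_path` and `spm_path` are hypothetical placeholders, and passing the saved checkpoint back through `checkpoint_path` is my reading of the intended round trip:

```python
# Hedged sketch: rebuild the mT5 graph with bert4keras and reload the fine-tuned
# weights that were saved with save_weights_as_checkpoint().
from bert4keras.models import build_transformer_model
from bert4keras.tokenizers import SpTokenizer

config_path = 'mt5_base/mt5_base_config.json'   # hypothetical path
spm_path = 'mt5_base/sentencepiece_cn.model'    # hypothetical path
fine_tune_checkpoint_path = 'best_model_mt5/best_model.ckpt'

# Load the SentencePiece tokenizer (mT5 has no BOS token; '</s>' is EOS).
tokenizer = SpTokenizer(spm_path, token_start=None, token_end='</s>')

# Rebuild the model and load the fine-tuned checkpoint.
t5 = build_transformer_model(
    config_path=config_path,
    checkpoint_path=fine_tune_checkpoint_path,
    model='mt5.1.1',
    return_keras_model=False,
    name='T5',
)
encoder, decoder = t5.encoder, t5.decoder
```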
-
## Environment info
- `transformers` version: 4.4.2
- Platform:
- Python version: 3.7
- PyTorch version (GPU?): 1.8
- Tensorflow version (GPU?):
- Using GPU in script?: -
- Using distribu…
-
Thanks for the nice package!
From the readme, I see that you have tried T5. I'm wondering if you have tried mT5 at all? If so, it would be great to mention it in the readme (and the spreadsheet of the sc…
-
## Environment info
- `transformers` version: 4.9.0.dev0
- Platform: Linux-5.4.0-1043-gcp-x86_64-with-glibc2.29
- Python version: 3.8.10
- Flax version (CPU?/GPU?/TPU?): 0.3.4 (tpu)
- Jax version…
-
Hi.
About half a year ago, when I fine-tuned my original Japanese summarization task with multilingual-t5, it worked fine.
But now the same task does not work. :_(
The library version and error are as follo…