-
## Introduction
To ensure a more straightforward approach to studies, we've elected to make a change that pushes iterations, questions, themes, etc. into sub-studies. This way it becomes simpler…
-
Same as the title: when using `run_summarization.py`, how can I run transformer models such as `t5-small` or `facebook/bart-large-cnn` without loading pre-trained weights? I only want to train their original mod…
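One common way to get an architecture without its trained weights (a minimal sketch, assuming the `transformers` library is installed and the hub config is reachable; this is not necessarily how `run_summarization.py` wires it up) is to load only the config and instantiate the model from it:

```python
from transformers import AutoConfig, AutoModelForSeq2SeqLM

# Fetch only the architecture definition (the config), not the trained weights.
config = AutoConfig.from_pretrained("t5-small")

# Build the model from the config: all weights are randomly initialized.
model = AutoModelForSeq2SeqLM.from_config(config)
```

The same pattern applies to `facebook/bart-large-cnn` by swapping the checkpoint name passed to `AutoConfig.from_pretrained`.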
-
- `transformers` version: 4.5.0
- Platform: linux
- Python version: 3.8
- PyTorch version (GPU?): 1.7.1
- Tensorflow version (GPU?):
- Using GPU in script?: yes
- Using distributed or paralle…
-
Hello~
What is the difference between tdat and sdat?
Could the README be made more complete, with instructions on how to use your model on other datasets?
-
**Please be sure to add the Anthology ID in the title**
[See here](https://www.aclweb.org/anthology/info/corrections/) to read about the three types of corrections.
## Metadata correction: pleas…
-
Hello,
I tried the example code in the official website as below.
# code
```python
from transformers import EncoderDecoderModel, BertTokenizer
import torch
tokenizer = BertTokenizer.from_pretrained('ber…
```
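The example above is cut off. As a rough, self-contained sketch of the same API (using a hypothetical tiny architecture built from configs rather than the official example's checkpoints, so it runs without downloading weights):

```python
import torch
from transformers import BertConfig, EncoderDecoderConfig, EncoderDecoderModel

# Tiny illustrative configs; these sizes are arbitrary, not real checkpoints.
encoder_cfg = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                         num_attention_heads=2, intermediate_size=64)
decoder_cfg = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                         num_attention_heads=2, intermediate_size=64,
                         is_decoder=True, add_cross_attention=True)

config = EncoderDecoderConfig.from_encoder_decoder_configs(encoder_cfg, decoder_cfg)
model = EncoderDecoderModel(config=config)

# Needed so the model can shift labels into decoder inputs during training.
model.config.decoder_start_token_id = 0
model.config.pad_token_id = 0

input_ids = torch.tensor([[1, 2, 3, 4]])
labels = torch.tensor([[1, 2, 3, 4]])
outputs = model(input_ids=input_ids, labels=labels)  # outputs.loss is the LM loss
```

A fine-tuning loop would backpropagate `outputs.loss` as with any PyTorch model.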
-
Hi! I fine-tuned the BART model on XSum (both training and validation are fine). However, I hit an OOM error during prediction on the same machine. @patrickvonplaten @patil-suraj Here is my code…
-
I'm getting stuck fine-tuning the BART model on the reddit-tifu dataset. When I use a pre-trained BART model, for example `bart-large-xsum`, without fine-tuning, it works fine and produces sort of sensib…
-
## Environment info
- `transformers` version: 4.4.0.dev0
- Platform: Linux-5.3.0-53-generic-x86_64-with-glibc2.10
- Python version: 3.8.3
- PyTorch version (GPU?): 1.7.1 (True)
- Tensorflow v…
-
Actually, this FR contains several requests:
1. A Bake description file should be self-contained: we should be able to build all images that are defined in its description (currently, we can build onl…