-
Here is what I ran:
```python
from transformers.hf_api import HfApi
from tqdm import tqdm
import pandas as pd
model_list = HfApi().model_list()
model_ids = [x.modelId for x in model_list]
…
```
-
## Environment info
- `transformers` version: 4.5.1
- Platform: Linux-4.15.0-143-generic-x86_64-with-glibc2.27
- Python version: 3.9.4
- PyTorch version (GPU?): 1.7.1 (True)
- Tensorflow vers…
-
I want to train on a long-sequence dataset (a MIDI text event representation like the one in [MuseNet](https://openai.com/blog/musenet/#dataset)) from scratch. Since I can't split the sequence into "sentenc…
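The snippet is cut off, but the underlying problem — one long token stream with no natural sentence boundaries — is usually handled by slicing the stream into fixed-length, optionally overlapping blocks for language modeling. A minimal sketch (the block size and stride values are illustrative, not from the issue):

```python
# Long token streams (e.g. MIDI event tokens) have no sentence
# boundaries, so slice them into fixed-size blocks instead.
def chunk_stream(tokens, block_size=512, stride=512):
    """Yield fixed-size blocks; stride < block_size gives overlap."""
    for start in range(0, max(len(tokens) - block_size + 1, 1), stride):
        yield tokens[start:start + block_size]

stream = list(range(10))  # stand-in for a MIDI event token stream
blocks = list(chunk_stream(stream, block_size=4, stride=2))
print(blocks)  # [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```

With `stride` equal to `block_size` the blocks are disjoint; a smaller stride duplicates context across blocks, which some long-sequence setups prefer.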
-
I am using a Bert2Bert `EncoderDecoderModel` from Hugging Face for sentence simplification, but my model generates a zero tensor of the same length regardless of the input. Could someone help me what i…
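One quick way to read an all-zero generation: in BERT-style vocabularies, token id 0 is `[PAD]`, so the model is emitting nothing but padding — a common symptom when `decoder_start_token_id` and `pad_token_id` were never set on the model config. A toy check, using an assumed BERT-like mapping of special-token ids:

```python
# Assumed BERT-style special-token ids; id 0 is "[PAD]".
id_to_token = {0: "[PAD]", 101: "[CLS]", 102: "[SEP]"}
generated_ids = [0, 0, 0, 0]  # the reported all-zero output
decoded = [id_to_token.get(i, "[UNK]") for i in generated_ids]
print(decoded)  # ['[PAD]', '[PAD]', '[PAD]', '[PAD]']
```

If that is the symptom, the usual remedy (sketched, not verified against this issue) is to set `model.config.decoder_start_token_id = tokenizer.cls_token_id` and `model.config.pad_token_id = tokenizer.pad_token_id` before training or generation.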
-
## Environment info
- `transformers` version: 4.4.0.dev0
- blenderbot, bart, marian, pegasus, encoderdecoder, t5: @patrickvonplaten, @patil-suraj
```
python examples/seq2seq/run_seq2seq.py …
```
-
Hi! Thank you for your awesome work!
I want to perform semantic parsing. Unfortunately, I couldn't find any examples in the Hugging Face repo for that. Could you please let me know how I should proceed…
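The question is cut off, but semantic parsing is typically cast as plain seq2seq: the utterance is the source text and the logical form is the target text, so any encoder-decoder example applies. A sketch with made-up pairs (the utterances and query strings are illustrative only):

```python
# Semantic parsing as text-to-text: utterance -> logical form.
examples = [
    ("show me flights from boston to denver",
     "SELECT flights WHERE origin='boston' AND dest='denver'"),
    ("list all red cars",
     "SELECT cars WHERE color='red'"),
]
sources = [src for src, _ in examples]
targets = [tgt for _, tgt in examples]
print(len(sources), len(targets))  # 2 2
```

Once the data is in this (source, target) form, a generic seq2seq fine-tuning script can be pointed at it with no parsing-specific changes.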
-
Hi, the model at https://huggingface.co/google/bert2bert_L-24_wmt_de_en doesn't work anymore. It seems the library has changed a lot since the model was added, so the classes themselves s…
-
```
Evaluation: 100% 30/30 [00:45<…] CHECK failed: (index) >= (0):
terminate called after throwing an instance of 'google::protobuf::FatalException'
what(): CHECK failed: (index) >= (0):
```
I am using the following script:…
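The script itself is truncated, but a common trigger for this sentencepiece/protobuf `CHECK failed: (index) >= (0)` error during evaluation is decoding label sequences that still contain the `-100` loss-masking value, which is a negative (invalid) token index. A sketch of the usual guard, assuming the conventional `-100` padding and a placeholder `pad_token_id`:

```python
# Labels padded with -100 for loss masking must be mapped back to a
# real token id before decoding, or the tokenizer sees a negative index.
pad_token_id = 0  # assumed; in practice use tokenizer.pad_token_id
labels = [[5, 6, 7, -100, -100]]
safe = [[pad_token_id if t == -100 else t for t in seq] for seq in labels]
print(safe)  # [[5, 6, 7, 0, 0]]
```

After this substitution the sequences can be passed to the tokenizer's batch decoding without hitting the negative-index check.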
-
I have been looking to do some seq2seq tasks in huggingface-transformers using the BertGeneration or EncoderDecoderModel classes.
But I have only ended up finding some simple examples described in …
-
Hi,
I am following the instructions written on the HuggingFace website to use an encoder-decoder model:
```python
from transformers import EncoderDecoderModel, BertTokenizer
import torch

tokeniz…
```