-
## Environment info
- `transformers` version:
- Platform: Ubuntu
- Python version: anaconda python 3.7
- PyTorch version (GPU?):
- Tensorflow version (GPU?):
- Using GPU in script?:
…
-
# 🐛 Bug
Old versions of the adapters initialized `*adapter_attention*` parameters which were never used but were still stored.
I proposed a two-stage fix (a workaround sketch follows the list):
- [ ] hot fix which does not log the warning that the paramete…
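Until a fix lands, one possible workaround is to strip the stale entries from the checkpoint by hand. This is only a sketch under assumptions: the checkpoint path is hypothetical, and it assumes the unused `*adapter_attention*` weights can simply be deleted from the state dict.

```python
import torch

# hypothetical path to a checkpoint written by an old adapter version
ckpt = "old_adapter_checkpoint/pytorch_model.bin"

state_dict = torch.load(ckpt, map_location="cpu")

# drop the never-used *adapter_attention* weights so that loading the
# checkpoint no longer reports them as unexpected parameters
stale_keys = [k for k in state_dict if "adapter_attention" in k]
for k in stale_keys:
    del state_dict[k]

torch.save(state_dict, ckpt)
```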
-
# 🐛 Bug
The package gives no information about the SSL error encountered, which makes it difficult to troubleshoot or find a workaround.
## Information
When trying to do:
`TFAutoModelWithLMHead.from_pre…
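For reference, a minimal sketch of the call and of how the underlying error could be surfaced; the checkpoint name and the exception types caught are assumptions, since the original call is truncated above and the exact exception raised depends on the `transformers` version.

```python
import requests
from transformers import TFAutoModelWithLMHead

try:
    # "gpt2" is a placeholder for the checkpoint in the truncated call above
    model = TFAutoModelWithLMHead.from_pretrained("gpt2")
except (requests.exceptions.SSLError, OSError) as err:
    # print the underlying SSL failure instead of an opaque message
    print(f"from_pretrained failed: {err!r}")
    raise
```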
-
## Environment info
- `transformers` version: 4.3.0.dev0
- Platform: Linux-5.10.7-gentoo-x86_64-AMD_Ryzen_9_3950X_16-Core_Processor-with-glibc2.2.5
- Python version: 3.8.7
- PyTorch version (G…
-
- `transformers` version: 4.1.1
- Platform: Windows-10-10.0.19041-SP0
- Python version: 3.7.9
- PyTorch version (GPU?): 1.7.1+cu101 (True)
- Tensorflow version (GPU?): not installed (NA)
- Us…
-
## Context
I used the official Q&A example [here](https://huggingface.co/dbmdz/bert-base-italian-cased), but slightly modified the `Pipeline` to use the `model` and `tokenizer` objects from …
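A minimal sketch of the modification, assuming a question-answering pipeline; the question/context strings are made up, and `AutoModelForQuestionAnswering` stands in for whatever model class the original (truncated) snippet used.

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline

model_name = "dbmdz/bert-base-italian-cased"  # from the linked model card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

# pass the loaded objects instead of the model id string
nlp = pipeline("question-answering", model=model, tokenizer=tokenizer)
result = nlp(
    question="Dove vive Mario?",
    context="Mi chiamo Mario e vivo a Roma.",
)
print(result)
```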
-
## Environment info
- `transformers` version: 4.2.2
- Platform: Manjaro Linux (Feb 2021)
- Python version: 3.8.5
- PyTorch version (GPU?): 1.7.1 (GPU)
- Tensorflow version (GPU?):
- Using G…
-
## Environment info
- `transformers` version: 4.2.1 vs. 3.4.0
- Platform: Colab (K80 GPU)
- Python version: 3.6.9
- PyTorch version (GPU?): 1.7.0+cu101
- Tensorflow version (GPU?): N.A.
- Usi…
-
I am pretraining T5 and BART.
I noticed that the padding token in the ```labels``` of these models should be -100, rather than the pad token used for ```decoder_input_ids```.
I changed the padding token for the labels for T5 (pytorch, te…
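A minimal sketch of what I mean for T5 (the ```t5-small``` checkpoint and the example sentences are placeholders): padding positions in ```labels``` are set to -100 so the loss ignores them, while the model derives ```decoder_input_ids``` from the labels internally using the real pad token.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")  # placeholder checkpoint
model = T5ForConditionalGeneration.from_pretrained("t5-small")

src = ["translate English to German: Hello",
       "translate English to German: How are you today?"]
tgt = ["Hallo", "Wie geht es dir heute?"]

inputs = tokenizer(src, return_tensors="pt", padding=True)
targets = tokenizer(tgt, return_tensors="pt", padding=True)

labels = targets.input_ids.clone()
# replace padding token ids with -100 so the cross-entropy loss ignores them
labels[labels == tokenizer.pad_token_id] = -100

outputs = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels)
print(outputs.loss)
```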
-
## Environment info
- `transformers` version: 4.2.2
- Platform: Ubuntu
- Python version: 3.7
- PyTorch version (GPU?):
- Tensorflow version (GPU?): 1.7.1 - GPU : T4
- Using GPU in script?: y…