-
We should add an `XLMRobertaClassifier` modeled on `BertClassifier`. It will wrap the `XLMRoberta` backbone with a classification head.
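The wrapping pattern can be sketched at the shape level. This is a minimal numpy mock-up, not the keras-nlp implementation: the real class would subclass `keras.Model` and reuse the actual `XLMRoberta` backbone, and all names, dimensions, and the mean-pooling step below are illustrative assumptions.

```python
import numpy as np

class DummyBackbone:
    """Stand-in for an XLMRoberta backbone: token ids -> pooled features.

    (A real backbone returns per-token hidden states; mean-pooling here is
    just a crude placeholder for a sentence representation.)
    """
    def __init__(self, hidden_dim, vocab_size, seed=0):
        rng = np.random.default_rng(seed)
        self.embeddings = rng.standard_normal((vocab_size, hidden_dim))

    def __call__(self, token_ids):
        return self.embeddings[token_ids].mean(axis=1)  # (batch, hidden_dim)

class XLMRobertaClassifier:
    """Sketch of the BertClassifier pattern: backbone + dense softmax head."""
    def __init__(self, backbone, num_classes, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.backbone = backbone
        self.w = rng.standard_normal((hidden_dim, num_classes)) * 0.02
        self.b = np.zeros(num_classes)

    def __call__(self, token_ids):
        features = self.backbone(token_ids)           # (batch, hidden_dim)
        logits = features @ self.w + self.b           # (batch, num_classes)
        exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
        return exp / exp.sum(axis=-1, keepdims=True)  # softmax probabilities

backbone = DummyBackbone(hidden_dim=8, vocab_size=100)
clf = XLMRobertaClassifier(backbone, num_classes=3, hidden_dim=8)
probs = clf(np.array([[1, 5, 9], [2, 4, 6]]))
print(probs.shape)         # (2, 3)
print(probs.sum(axis=-1))  # each row sums to ~1.0
```

The design point is composition: the classifier owns a backbone instance and only adds the head, which is the structure `BertClassifier` already uses.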
-
We recently made a few larger changes to the BERT API:
https://github.com/keras-team/keras-nlp/pull/387
https://github.com/keras-team/keras-nlp/pull/390
As well as some minor cleanups:
https://github.com/…
-
### System Info
The issue is platform-independent.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An …
-
## Environment info
- `adapter-transformers` version: 2.0.0
- Platform: Linux
- Python version: 3.7.7
- PyTorch version (GPU?): GPU
## Details
I can find a suitable learning rate on t…
-
### System Info
```shell
optimum==1.4.0
transformers==4.22.1
onnx==1.12.0
onnxruntime==1.12.1
Python 3.9.13
```
### Who can help?
@lewtun
@michaelbenayoun
### Information
…
-
### System Info
Transformers 4.21.0
### Who can help?
@LysandreJik @sgugger
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks
- [X] An officially suppo…
-
### System Info
- transformers version: 4.18.1
- Platform: Linux (Jupyter Notebook), TF 2.3, 2 GPUs
- Python version: 3.6
- PyTorch version: 1.7.1+cu101
- Using GPU in script?: yes
- Using distributed or paral…
-
I have trained an adapter and saved it to my hard drive.
When I want to load it, I use the following code:
```python
adapter_name = model.model.roberta.load_adapter(adapter_path, set_active=True)
prin…
```
-
Thanks for the great repo.
In the mentioned repo, https://github.com/hyunwoongko/asian-bart, the author prunes the mBART embedding layer. I want to do the same for a particular language.
Any suggesti…
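The core step in that kind of pruning is to keep only the embedding rows for the token ids the target language actually uses, and to remap ids accordingly. A toy numpy sketch of just that step (the function name and shapes are illustrative; a real mBART model would also need the tied output projection and the tokenizer updated consistently):

```python
import numpy as np

def prune_embeddings(embedding_matrix, kept_token_ids):
    """Keep only the rows for the given token ids.

    Returns the reduced matrix and an old-id -> new-id mapping so that
    tokenizer output can be remapped to the smaller vocabulary.
    """
    kept_token_ids = sorted(set(kept_token_ids))
    new_matrix = embedding_matrix[kept_token_ids]
    id_map = {old: new for new, old in enumerate(kept_token_ids)}
    return new_matrix, id_map

# Toy "full" embedding table (mBART's real table is ~250k x 1024).
full = np.random.default_rng(0).standard_normal((50000, 16))
reduced, id_map = prune_embeddings(full, kept_token_ids=[0, 2, 5, 99, 1234])
print(reduced.shape)  # (5, 16)
print(id_map[99])     # 3
```

In practice the kept ids would come from tokenizing a corpus of the target language plus the special tokens, which is essentially what the asian-bart repo does per language.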
-
I am very interested in multilingual embedding models, but no converted multilingual model is available. Following the comments in the `sentence_embeddings_local` example, I converted many models successfull…