-
### Feature request
Adding support for M2M100 would allow running faster translation on GPUs thanks to the optimizations in the BetterTransformer module.
It would also benefit the decoder in addi…
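As an illustration of what this request would enable, here is a rough sketch of the usual Optimum BetterTransformer workflow; since M2M100/NLLB coverage is exactly what is being asked for, the conversion call may simply fail on this architecture until support lands.

```python
# Rough sketch of the usual Optimum BetterTransformer workflow. M2M100/NLLB
# coverage is what this request asks for, so transform() may raise
# NotImplementedError on this architecture until support is added.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from optimum.bettertransformer import BetterTransformer

model_name = "facebook/nllb-200-distilled-600M"
tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).to("cuda").eval()

# Swap supported encoder/attention layers for fused BetterTransformer kernels.
model = BetterTransformer.transform(model)

inputs = tokenizer("Hello, world!", return_tensors="pt").to("cuda")
with torch.inference_mode():
    out = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    )
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```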
-
### News
- Conferences
- ECCV 2022 results announced: congratulations to everyone whose paper was accepted.
- NAACL 2022: July 10–15, Seattle
- [Even AI mobilized for ideological censorship… China's digital totalitarianism bares its claws](https://www.hankookilbo.com/News/Read/A202207…
-
NusaCatalogue: https://indonlp.github.io/nusa-catalogue/card.html?kopi_nllb
| Dataset | kopi_nllb |
|-------------|---|
| Description | KopI(Korpus Perayapan Indonesia)-NLLB, i…
-
NusaCatalogue: https://indonlp.github.io/nusa-catalogue/card.html?nllb_seed
| Dataset | nllb_seed |
|-------------|---|
| Description | NLLB Seed is a set of professionally-tra…
-
### Feature request
When I use the model 'facebook/nllb-200-distilled-600M' for translation, each call takes about 0.5 seconds, and it does not appear to be async.
I want to get it down to about 0.1 seconds and make it async. Is…
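Not part of the issue itself, but a minimal sketch of two commonly suggested levers, assuming a CUDA GPU: fp16 weights to cut per-request latency, and running the blocking `generate()` call in a worker thread so it can be awaited from asyncio code (the helper names are mine, and neither change guarantees the 0.1 s target).

```python
# Minimal sketch (not from the issue): fp16 inference plus wrapping the
# blocking generate() call so it can be awaited from asyncio code.
import asyncio
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "facebook/nllb-200-distilled-600M"
tokenizer = AutoTokenizer.from_pretrained(model_name, src_lang="eng_Latn")
model = (
    AutoModelForSeq2SeqLM.from_pretrained(model_name, torch_dtype=torch.float16)
    .to("cuda")
    .eval()
)

def translate(text: str, tgt_lang: str = "fra_Latn") -> str:
    inputs = tokenizer(text, return_tensors="pt").to("cuda")
    with torch.inference_mode():
        out = model.generate(
            **inputs,
            forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
            max_new_tokens=64,
        )
    return tokenizer.batch_decode(out, skip_special_tokens=True)[0]

async def translate_async(text: str) -> str:
    # generate() is synchronous; run it in the default thread pool so the
    # asyncio event loop is not blocked while the model works.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, translate, text)

print(asyncio.run(translate_async("Hello, world!")))
```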
-
### Issue URL (Annoyance)
[https://nllb.metademolab.com/](https://adguardteam.github.io/AnonymousRedirect/redirect.html?url=https%3A%2F%2Fnllb.metademolab.com%2F)
### Comment
Username: @SeriousH…
-
### Feature request
Add Flax/JAX support for M2M100 so it can be optimized on TPUs.
### Motivation
NLLB is a great translation model that supports many languages and also has good accuracy.
It can be…
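A Flax M2M100/NLLB class does not exist in `transformers` yet (that is the request), so the sketch below uses the existing `FlaxMarianMTModel` purely as a stand-in to illustrate the jit-compiled generation workflow that the requested port would enable for NLLB checkpoints on TPU.

```python
# Illustration only: FlaxM2M100ForConditionalGeneration is not available, so an
# existing Flax seq2seq model stands in to show the jit + generate pattern that
# a Flax/JAX M2M100 port would make possible for NLLB on TPU.
import jax
from transformers import AutoTokenizer, FlaxMarianMTModel

model_name = "Helsinki-NLP/opus-mt-en-de"  # stand-in model, not NLLB
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = FlaxMarianMTModel.from_pretrained(model_name)

@jax.jit  # XLA-compile the whole generation loop (runs on TPU when available)
def generate(params, input_ids, attention_mask):
    return model.generate(
        input_ids,
        attention_mask=attention_mask,
        params=params,
        max_length=64,
    ).sequences

# Pad to a fixed length so the jitted function is not retraced per input shape.
batch = tokenizer(
    ["NLLB is a great translation model."],
    return_tensors="np",
    padding="max_length",
    max_length=64,
    truncation=True,
)
ids = generate(model.params, batch["input_ids"], batch["attention_mask"])
print(tokenizer.batch_decode(ids, skip_special_tokens=True))
```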
-
### System Info
- `transformers` version: 4.22.2
- Platform: Linux-5.10.133+-x86_64-with-Ubuntu-18.04-bionic
- Python version: 3.7.14
- Huggingface_hub version: 0.10.0
- PyTorch version (GPU?): 1…
-
https://github.com/facebookresearch/fairseq/tree/nllb
- FLORES-200, covering 20 of the Eighth Schedule languages plus more
- LID (language identification; see the sketch after this list)
- LASER3 encoder
- Toxicity List
- NLLB translation models
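For the LID entry above, a small sketch of how the released fastText language-identification model is typically queried; the Hub repo id and filename below are assumptions, so point `hf_hub_download` at wherever you obtained the lid218e model from the fairseq release.

```python
# Sketch of querying the NLLB fastText LID model. The repo id and filename
# are assumptions; substitute the path to your copy of the lid218e model.
import fasttext
from huggingface_hub import hf_hub_download

lid_path = hf_hub_download(
    repo_id="facebook/fasttext-language-identification",  # assumed repo id
    filename="model.bin",                                  # assumed filename
)
lid = fasttext.load_model(lid_path)

# predict() returns labels like '__label__ind_Latn' with confidence scores.
labels, scores = lid.predict("Ini adalah contoh kalimat dalam bahasa Indonesia.")
print(labels[0], float(scores[0]))
```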