google-research / electra

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Apache License 2.0

ELECTRA-base fine tuned on MNLI #120

Closed — ngoquanghuy99 closed this 3 years ago

ngoquanghuy99 commented 3 years ago

Could you please release the ELECTRA-base fine tuned on MNLI or post it on Huggingface's Transformers?

PhilipMay commented 3 years ago

If you want to compute a semantic embedding and use cosine similarity, you may want to have a look at this: https://huggingface.co/T-Systems-onsite/cross-en-de-roberta-sentence-transformer
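A minimal sketch of that workflow: encode two sentences with the linked sentence-transformers model, then score them by cosine similarity. The `SentenceTransformer` usage is an assumption about how you'd load it (it requires the `sentence-transformers` package and downloads the model), so it's shown commented out; the cosine function itself is self-contained.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage with the suggested model (assumes sentence-transformers
# is installed; model name taken from the link above):
# from sentence_transformers import SentenceTransformer
# model = SentenceTransformer("T-Systems-onsite/cross-en-de-roberta-sentence-transformer")
# emb = model.encode(["How are you?", "Wie geht es dir?"])
# print(cosine_similarity(emb[0], emb[1]))  # close to 1.0 for paraphrases
```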

It is very good for English and German. More languages here: https://github.com/German-NLP-Group/xlsr#models