UKPLab / sentence-transformers

Multilingual Sentence & Image Embeddings with BERT
https://www.SBERT.net
Apache License 2.0

How to pretrain mlm and contrastiveLoss at the sametime? #1456

Open ZzyChris97 opened 2 years ago

ZzyChris97 commented 2 years ago

Can I use the training_multi-task example to pretrain a model with the MLM task and a contrastive loss at the same time? My data are all sentence pairs.

Looking forward to your reply!

nreimers commented 2 years ago

This is currently not possible. In the experiments of @kwang2049 it also performed quite badly, as MLM and contrastive loss optimize for different objectives.
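For context on what the training_multi-task example does: `SentenceTransformer.fit()` accepts a list of `(DataLoader, loss)` pairs in `train_objectives` and, within each epoch, draws one batch from each objective in round-robin order. MLM is not among the provided losses, which is why combining it would require custom training code. The scheduling pattern itself can be sketched with plain Python; all batch and objective names below are illustrative stand-ins, not actual library objects:

```python
from itertools import cycle

# Toy stand-ins for the batches of two training objectives.
# In sentence-transformers, these would be PyTorch DataLoaders
# paired with loss modules in fit(train_objectives=[...]).
mlm_batches = cycle([f"mlm_batch_{i}" for i in range(3)])
pair_batches = cycle([f"pair_batch_{i}" for i in range(2)])

objectives = [
    ("masked_language_modeling", mlm_batches),
    ("contrastive", pair_batches),
]

def training_steps(objectives, num_steps):
    """Return (objective_name, batch) tuples in round-robin order,
    one batch per objective per step, as multi-task fit() schedules them."""
    log = []
    for _ in range(num_steps):
        for name, batches in objectives:
            log.append((name, next(batches)))
    return log

for name, batch in training_steps(objectives, num_steps=2):
    print(name, batch)
```

Each step performs one backward pass per objective on the shared encoder; since MLM and the contrastive loss push the encoder toward different representations, alternating them this way is exactly the setup that performed poorly in the experiments mentioned above.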