nomic-ai / contrastors

Train Models Contrastively in PyTorch
Apache License 2.0

Questions about learning rate settings #38

Closed by daegonYu 1 month ago

daegonYu commented 1 month ago

In Section 4.3 (Unsupervised Contrastive Pretraining) of the paper Nomic Embed: Training a Reproducible Long Context Text Embedder, the learning rate is given as 2e-5, but in the contrastors/src/contrastors/configs/train/contrastive_pretrain.yaml file it is set to 2e-4. Which is correct?

zanussbaum commented 1 month ago

Ah, thank you for pointing this out! That's a typo; it's 2e-4 for contrastive pretraining. I'll update the arXiv paper.
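
For anyone landing here later, a minimal sketch of how the setting might look in the YAML config; the key names and structure below are assumptions for illustration only, not copied from the repo, and only the 2e-4 value itself comes from this thread.

```yaml
# Hypothetical excerpt of contrastive_pretrain.yaml for illustration only;
# the key names here are assumptions, not copied from the repo.
train_args:
  # 2e-4 is the correct contrastive-pretraining learning rate per this thread
  # (the paper's 2e-5 is a typo). Written with a decimal point so that
  # strict YAML 1.1 loaders such as PyYAML parse it as a float, not a string.
  learning_rate: 2.0e-4
```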