Georgetown-IR-Lab / cedr

Code for CEDR: Contextualized Embeddings for Document Ranking, accepted at SIGIR 2019.
MIT License

Question about training a vanilla_bert #40

Open yiyaxiaozhi opened 2 years ago

yiyaxiaozhi commented 2 years ago

Thanks for your wonderful work on CEDR — it helped me a lot in understanding the document ranking task! I used this repo and followed the instructions to train a vanilla_bert on the 1-fold and 2-fold splits of the Robust04 dataset. However, the NDCG@20 results I obtained are:

- vanilla_bert_1fold: 0.40704
- vanilla_bert_2fold: 0.45290

These are quite different from your released checkpoints in #18:

- vanilla_bert_1fold: 0.42185
- vanilla_bert_2fold: 0.47948

My hyperparameters are as follows: image

Could you please share the hyperparameter settings you used when fine-tuning vanilla_bert?
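For reference, NDCG@20 numbers like those above are usually produced with trec_eval's `ndcg_cut.20` measure. A minimal Python sketch of that metric (linear gain, log2 discount — the convention trec_eval uses; `ndcg_at_k` is a hypothetical helper name, not part of the CEDR repo):

```python
import math

def ndcg_at_k(run_scores, qrels, k=20):
    """NDCG@k averaged over queries with judged documents.

    run_scores: {qid: {docid: retrieval_score}}
    qrels:      {qid: {docid: relevance_grade}}
    Uses linear gain rel / log2(rank + 1), as in trec_eval's ndcg_cut.
    """
    total, n = 0.0, 0
    for qid, doc_scores in run_scores.items():
        rels = qrels.get(qid, {})
        # Rank documents by descending retrieval score, keep top k.
        ranked = sorted(doc_scores, key=doc_scores.get, reverse=True)[:k]
        dcg = sum(rels.get(d, 0) / math.log2(i + 2) for i, d in enumerate(ranked))
        # Ideal DCG: relevance grades sorted from highest to lowest.
        ideal = sorted(rels.values(), reverse=True)[:k]
        idcg = sum(r / math.log2(i + 2) for i, r in enumerate(ideal))
        if idcg > 0:
            total += dcg / idcg
            n += 1
    return total / n if n else 0.0
```

Running this over a TREC-format run file and the Robust04 qrels (after parsing them into the dictionaries above) should closely match the trec_eval output; small discrepancies between reruns more often come from training variance and hyperparameters than from the metric itself.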