beir-cellar / beir

A Heterogeneous Benchmark for Information Retrieval. Easy to use: evaluate your models across 15+ diverse IR datasets.
http://beir.ai
Apache License 2.0

Training script/configuration of T5 generator for GenQ #141

Open jihyukkim-nlp opened 1 year ago

jihyukkim-nlp commented 1 year ago

Hello, thank you for sharing this repo and useful hugging face model cards!

I am interested in T5 generators for query generation and am trying to extend this approach to other datasets/tasks. To do so, I would like to reproduce the T5 generators, specifically BeIR/query-gen-msmarco-t5-large-v1.

I am wondering if the training script and training configurations for the generators can be shared, including

Best regards, Jihyuk Kim
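For context, GenQ trains a T5 model to generate a relevant query from a passage (doc2query style). Absent the original script, the seq2seq data format can be sketched as below; the field names and the whitespace-based truncation are assumptions for illustration, not the authors' confirmed pipeline:

```python
# Sketch of doc2query-style training pairs assumed for GenQ: the T5
# generator learns to map a passage (source) to a relevant query
# (target). Field names and truncation length are assumptions.

def build_seq2seq_examples(pairs, max_passage_tokens=300):
    """Turn (passage, query) pairs into source/target strings for T5."""
    examples = []
    for passage, query in pairs:
        # Rough whitespace truncation as a stand-in for the subword
        # truncation the real tokenizer would perform.
        source = " ".join(passage.split()[:max_passage_tokens])
        examples.append({"source": source, "target": query})
    return examples

pairs = [
    ("Python is a programming language created by Guido van Rossum.",
     "who created python"),
]
print(build_seq2seq_examples(pairs)[0]["target"])  # -> who created python
```

In the MS MARCO setting these pairs would come from the dataset's (query, positive passage) annotations, with the direction reversed so the passage is the input.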

thakur-nandan commented 1 year ago

Hi @jihyukkim-nlp,

Sadly, I do not remember the exact training details for the question-generator models. @nreimers, could you help me here?

Regarding the first two points:

jihyukkim-nlp commented 1 year ago

Hi @thakur-nandan,

Thank you for the information! I have tried a few different configurations, e.g., different learning rates (1e-5, 3e-5, 5e-5, 1e-4), with and without warmup steps, but I have failed to reproduce the model.
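The warmup variants tried above can be sketched as a standard linear-warmup, linear-decay schedule (an assumption for illustration; the original schedule used for the released checkpoint is unknown):

```python
# Minimal sketch of a linear-warmup + linear-decay learning-rate
# schedule (an assumption; the checkpoint's actual schedule is unknown).
# With warmup_steps > 0 the LR ramps linearly from 0 to base_lr, then
# decays linearly to 0 over the remaining steps; warmup_steps == 0
# reduces to plain linear decay.

def lr_at_step(step, base_lr, warmup_steps, total_steps):
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(total_steps - warmup_steps, 1)
    progress = (step - warmup_steps) / remaining
    return base_lr * max(0.0, 1.0 - progress)

for base_lr in (1e-5, 3e-5, 5e-5, 1e-4):  # the learning rates tried above
    peak = lr_at_step(1000, base_lr, warmup_steps=1000, total_steps=10000)
    assert abs(peak - base_lr) < 1e-12  # LR peaks at base_lr after warmup
```

This mirrors what `get_linear_schedule_with_warmup` in Hugging Face Transformers computes, which is a common default for T5 fine-tuning.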

It worked relatively well for MS MARCO, but not for the other BEIR datasets. At this point, I am also wondering whether other datasets, such as NQ, were used to train the generator by any chance?

It would be really helpful if @nreimers could help.