koursaros-ai / nboost

NBoost is a scalable, search-api-boosting platform for deploying transformer models to improve the relevance of search results on different platforms (e.g., Elasticsearch)
Apache License 2.0

train model? #35

Closed pommedeterresautee closed 4 years ago

pommedeterresautee commented 4 years ago

Would it be possible to train a model on our own data, leveraging pretrained BERT models from the transformers library for instance? I don't see anything related to the training part (on the PyTorch side in particular).

colethienes commented 4 years ago

Hi, we don't support training in NBoost. However, you can use a custom model (after training it yourself) by passing --model_dir to specify the model binary and --model PtBertModel to specify the PyTorch BERT model class. There is also support for benchmarking in the documentation.
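
For example (a minimal sketch; the host/port values and the model directory path are illustrative assumptions based on typical nboost usage, not from this thread):

```
nboost --uhost localhost --uport 9200 --model_dir ./my_finetuned_bert --model PtBertModel
```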

pommedeterresautee commented 4 years ago

Thank you for your answer. Can you point me to source code for training a model? Is it just classical classification, reusing the raw logit on the [CLS] token as in the Cho paper? Or something more advanced?

pertschuk commented 4 years ago

We reuse the raw logits as in the Cho paper, with model code based on Hugging Face's transformers library.

Training can also benefit from gradually increasing the number of positive examples to avoid overfitting (I forget the name of this technique off the top of my head).
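
For reference, a minimal sketch of that scoring scheme (the checkpoint path and the two-class head are assumptions for illustration; this is not nboost's actual model code):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical path to a fine-tuned BERT re-ranker; substitute your own checkpoint.
MODEL_DIR = "./my_finetuned_bert"

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_DIR)
model.eval()

def relevance_score(query: str, passage: str) -> float:
    # Encode the pair as [CLS] query [SEP] passage [SEP]; the classification
    # head sits on top of the [CLS] representation.
    inputs = tokenizer(query, passage, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes a two-class head where index 1 is "relevant"; the raw logit
    # is used directly as the ranking score, as in the Cho paper.
    return logits[0, 1].item()

# Re-rank candidate passages for a query by descending relevance score.
candidates = ["first candidate passage ...", "second candidate passage ..."]
reranked = sorted(candidates,
                  key=lambda p: relevance_score("example query", p),
                  reverse=True)
```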

pommedeterresautee commented 4 years ago

Thanks. I'm closing the issue, but I think sharing the training source code would be really great (even if it's easy to adapt). The Cho code is old and TF-based, TF changes all the time, it requires analysis to be sure of having the same hyperparameters, etc.

Sharathmk99 commented 4 years ago

@pommedeterresautee did you figure out how to train a model on your own data? I'd be interested in any pointers you can share. Thanks

nreimers commented 3 years ago

@pommedeterresautee @Sharathmk99

I just pushed a clean and up-to-date example of how to train BERT and other transformer models on MS MARCO: https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/ms_marco/train_cross-encoder.py

The result is a Hugging Face Transformers model that can be used with nboost. The models I have trained so far outperform the nboost models at comparable model size and runtime (they will soon be added to the Hugging Face models repository).
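
A condensed sketch of what that script does, using the sentence-transformers CrossEncoder API (the base model name, toy training pairs, and hyperparameters here are illustrative assumptions; see the linked script for the real MS MARCO setup):

```python
from torch.utils.data import DataLoader
from sentence_transformers import InputExample
from sentence_transformers.cross_encoder import CrossEncoder

# Base model and hyperparameters are illustrative; the linked script trains
# on real MS MARCO query-passage pairs instead of these toy examples.
model = CrossEncoder("distilroberta-base", num_labels=1, max_length=512)

train_samples = [
    InputExample(texts=["what is nboost",
                        "NBoost deploys transformer models ..."], label=1.0),
    InputExample(texts=["what is nboost",
                        "An unrelated passage ..."], label=0.0),
]
train_dataloader = DataLoader(train_samples, shuffle=True, batch_size=2)

# fit() runs a standard fine-tuning loop with linear warmup.
model.fit(train_dataloader=train_dataloader, epochs=1, warmup_steps=100)
model.save("output/msmarco-cross-encoder")
```

With num_labels=1 the model outputs a single relevance logit per query-passage pair, which is the score used for re-ranking.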