HSLCY / GlossBERT

GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge (EMNLP 2019)
https://arxiv.org/pdf/1908.07245.pdf
MIT License

Pretrained Models #2

Closed · mbevila closed this issue 4 years ago

mbevila commented 4 years ago

Hello,

are there plans to release pretrained models? The model is very expensive to train, and the research community would really benefit from having them available.

Thanks in advance.

HSLCY commented 4 years ago

Hello, I've received your email.

The model was lost because the server was formatted and I didn't make a backup. (sad) But I'm training a new model with the same hyperparameters for you.

And I'm very sorry that the new model might be slightly different from (but comparable to) the original one, because the old GPUs have been upgraded to a different type (from Tesla V100-PCIE to Tesla V100-SXM2).

It takes 30+ hours to train, and I will write to you once it finishes.

Thanks for your patience.
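
For anyone waiting on the checkpoint: GlossBERT casts WSD as binary classification over context-gloss pairs, so a released model can be queried like any BERT sentence-pair classifier. Below is a minimal sketch, not the authors' code; the checkpoint directory `path/to/glossbert-checkpoint` is hypothetical, and the assumption that label index 1 means "correct sense" may differ in the actual release.

```python
# Minimal sketch of sense disambiguation with a GlossBERT-style checkpoint.
# Each candidate sense gloss is paired with the context sentence, scored by
# a binary classifier, and the highest-scoring gloss is selected.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Hypothetical local checkpoint directory (no official release path exists yet).
ckpt = "path/to/glossbert-checkpoint"
tokenizer = BertTokenizer.from_pretrained(ckpt)
model = BertForSequenceClassification.from_pretrained(ckpt)
model.eval()

context = "He sat on the bank of the river."
glosses = [
    "bank : sloping land beside a body of water",            # candidate sense 1
    "bank : a financial institution that accepts deposits",  # candidate sense 2
]

scores = []
with torch.no_grad():
    for gloss in glosses:
        # Encode the context-gloss pair as a standard BERT sentence pair.
        inputs = tokenizer(context, gloss, return_tensors="pt")
        logits = model(**inputs).logits
        # Assumed label convention: index 1 = "this gloss is the correct sense".
        scores.append(torch.softmax(logits, dim=-1)[0, 1].item())

best = max(range(len(glosses)), key=lambda i: scores[i])
print(glosses[best])
```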