ncbi-nlp / bluebert

BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
https://arxiv.org/abs/1906.05474

Pre-train NCBI BERT based on the newly released WWM BERT models #4

Closed bugface closed 5 years ago

bugface commented 5 years ago

Google recently released two new BERT models trained with the Whole Word Masking strategy (BERT-Large, Uncased (Whole Word Masking) and BERT-Base, Uncased (Whole Word Masking)). Do you have any plans to pre-train new NCBI models based on this release?
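
For readers unfamiliar with the strategy: in the original BERT pre-training, individual WordPiece tokens are masked independently, so only part of a word may be masked; Whole Word Masking instead masks all sub-tokens of a word together. Below is a minimal sketch of the idea, assuming BERT's WordPiece convention that continuation tokens start with "##" (the function name and masking probability are illustrative, not taken from the BlueBERT codebase):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    # Group WordPiece tokens into whole words: a token starting with
    # "##" continues the previous word (BERT WordPiece convention).
    word_spans = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and word_spans:
            word_spans[-1].append(i)
        else:
            word_spans.append([i])
    masked = list(tokens)
    for span in word_spans:
        # Mask all sub-tokens of a word together, or none of them.
        if random.random() < mask_prob:
            for i in span:
                masked[i] = mask_token
    return masked

# Example: "phosphorylation" is split into several sub-tokens; with WWM
# they are either all masked or all left intact.
print(whole_word_mask(["the", "ph", "##osp", "##hor", "##ylation", "site"]))
```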

yfpeng commented 5 years ago

We don't plan to train more models in the near future.