Thank you for your wonderful BioBERT pretrained model.
As part of my work, I would like to further train the BioBERT model on my own medical corpus (which has no labels) before using it for text embedding.
However, all the source code I have found requires labelled data — that is, it performs fine-tuning rather than further pretraining.
Could you please point me to a sample for that?
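For reference, the label-free objective I have in mind is BERT-style masked language modeling, which needs only raw text. Below is a minimal sketch of the masking step from the original BERT recipe (15% of tokens selected; of those, 80% become [MASK], 10% a random token, 10% unchanged). The token ids, vocabulary size, and [MASK] id here are illustrative placeholders, not taken from the actual BioBERT vocabulary.

```python
import random

MASK_ID = 103       # placeholder id for the [MASK] token
IGNORE_INDEX = -100 # positions with this label are ignored by the loss

def mask_tokens(input_ids, vocab_size, rng, mask_prob=0.15):
    """Return (masked_ids, labels) for one sequence of token ids."""
    masked = list(input_ids)
    labels = [IGNORE_INDEX] * len(input_ids)
    for i, tok in enumerate(input_ids):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must predict the original token here
            roll = rng.random()
            if roll < 0.8:
                masked[i] = MASK_ID                    # 80%: replace with [MASK]
            elif roll < 0.9:
                masked[i] = rng.randrange(vocab_size)  # 10%: random token
            # remaining 10%: keep the token unchanged
    return masked, labels

rng = random.Random(1)
ids = list(range(1000, 1020))          # a toy tokenized sentence
masked, labels = mask_tokens(ids, vocab_size=30522, rng=rng)
# Positions where labels != -100 are exactly the ones the model is trained on.
```

In practice this masking is handled by existing libraries (for example, Hugging Face Transformers provides a `DataCollatorForLanguageModeling`), so a further-pretraining script only needs raw sentences as input — no labels.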
Thank you very much.