Closed — Ofir408 closed this issue 7 months ago
Hi, we currently do not have pre-trained models; instead, we provide pre-trained embeddings of medical codes.
You can import those pre-trained embeddings by passing pretrained_emb="LM/clinicalbert"
when you initialize a model. Please check this notebook for details: pre-trained embedding tutorial.
You can further train the model after importing the embeddings.
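The idea of importing pre-trained code embeddings and then continuing to train them can be sketched as follows. This is a minimal NumPy illustration of the mechanism only, not PyHealth's actual API; the medical codes, dimensions, and objective below are made up for the example:

```python
import numpy as np

# Hypothetical pre-trained embeddings for a few medical codes,
# e.g. as produced by a clinical language model (values are made up).
rng = np.random.default_rng(0)
pretrained = {code: rng.normal(size=4) for code in ["E11.9", "I10", "J45"]}

# Build the model's embedding table by copying in the pre-trained vectors.
vocab = sorted(pretrained)
emb = np.stack([pretrained[c] for c in vocab])  # shape (3, 4)

# "Further training": one SGD step on a toy squared-error objective that
# pulls the first code's embedding toward a target vector.
target = np.ones(4)
lr = 0.1
grad = 2.0 * (emb[0] - target)  # gradient of ||emb[0] - target||^2
emb[0] -= lr * grad

# The first row has moved away from its pre-trained value;
# the other rows are untouched.
print(np.allclose(emb[0], pretrained[vocab[0]]))  # False
print(np.allclose(emb[1], pretrained[vocab[1]]))  # True
```

In the library itself, this copy-then-fine-tune step happens inside the model when you pass the `pretrained_emb` argument at initialization, as shown in the linked tutorial.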
Hi, how can I train starting from a pre-trained model?
For example, instead of using:
I would like to use a pre-trained Transformer or a pre-trained BERT model. Is that possible? @pat-jj