RyanWangZf / MedCLIP

EMNLP'22 | MedCLIP: Contrastive Learning from Unpaired Medical Images and Texts

Inquiries about the text encoder's pretrained weights #42

Open hangyulyoon opened 3 months ago

hangyulyoon commented 3 months ago

I apologize for asking about code released a few years ago, but it appears that the text encoder simply loads the BioClinicalBERT weights rather than weights obtained through MedCLIP's contrastive pretraining. From my understanding, the text encoder was not frozen during training, so its weights should have been updated by the contrastive objective. Are there any pretrained weights available for MedCLIP's text encoder other than the original BioClinicalBERT weights?
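
One way to check this empirically is to diff the text-encoder tensors in the released checkpoint against the public BioClinicalBERT release on the Hugging Face Hub. Below is a minimal sketch; the checkpoint path and the `text_model.model.` key prefix are assumptions about how the released MedCLIP checkpoint is laid out, so adjust them to the actual files.

```python
import torch
from transformers import AutoModel

# Reference weights: the public Bio_ClinicalBERT release on the Hugging Face Hub.
reference = AutoModel.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
ref_state = reference.state_dict()

# Assumed path to a downloaded MedCLIP checkpoint; replace with the real location.
ckpt = torch.load("pretrained/medclip-vit/pytorch_model.bin", map_location="cpu")

# Keep only the text-encoder entries and strip the assumed key prefix so the
# names line up with the reference state dict.
prefix = "text_model.model."
ckpt_text = {k[len(prefix):]: v for k, v in ckpt.items() if k.startswith(prefix)}

# If every shared tensor is identical, the checkpoint's text encoder is just
# Bio_ClinicalBERT; any mismatch would mean it was updated during training.
shared = ref_state.keys() & ckpt_text.keys()
identical = all(torch.equal(ref_state[k], ckpt_text[k]) for k in shared)
print(f"{len(shared)} shared tensors, identical to Bio_ClinicalBERT: {identical}")
```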