Closed youngwook06 closed 3 years ago
Hi @youngwook06,
I'm not sure I understand the difference between 1 and 2, but yes: in the versions that are not fine-tuned, BERT's parameters are set to be non-trainable. This means that only the parameters of the PACRR/KNRM/DRMM heads are updated.
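In case it helps, here is a minimal PyTorch sketch of that setup. The `TinyEncoder` and `RankingHead` classes are hypothetical stand-ins for BERT and a PACRR/KNRM/DRMM-style head, not the repo's actual code; the point is only how `requires_grad = False` freezes the encoder so that just the head is trained:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained BERT encoder (in practice this
# would be the real BERT model, e.g. from the transformers library).
class TinyEncoder(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        return self.proj(x)

# Stand-in for a PACRR/KNRM/DRMM-style ranking head on top of the encoder.
class RankingHead(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, reps):
        return self.score(reps)

encoder = TinyEncoder()
head = RankingHead()

# Freeze the encoder: with requires_grad=False its parameters are
# non-trainable, so only the head's parameters receive gradient updates.
for p in encoder.parameters():
    p.requires_grad = False

# Pass only trainable parameters to the optimizer (same effect, and it
# avoids keeping optimizer state for the frozen encoder).
optimizer = torch.optim.Adam(
    (p for p in head.parameters() if p.requires_grad), lr=1e-3
)

x = torch.randn(2, 8)
loss = head(encoder(x)).sum()
loss.backward()

# The frozen encoder accumulates no gradients; the head does.
assert all(p.grad is None for p in encoder.parameters())
assert all(p.grad is not None for p in head.parameters())
```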
Does this answer your question?
Yes, I got it! I appreciate your response! :)
I am wondering about the detailed settings for the models (PACRR, KNRM, DRMM) with BERT that is not fine-tuned in Table 1 of https://arxiv.org/pdf/1904.07094.pdf.
1) In those settings, are BERT's parameters trainable or not? 2) If they are not trainable, can I reproduce that setting simply by making BERT's parameters non-trainable?
Thanks for your great work!