qcwthu / Continual_Fewshot_Relation_Learning

MIT License

Question about experiment for BERT #9

Closed dabao12 closed 1 year ago

dabao12 commented 1 year ago

Thanks for your excellent work on incremental few-shot relation learning. It is really interesting and insightful. I tried to run the experiment with BERT, but the accuracy is far from the results reported in the paper. Are you willing to share how to run CFRE based on BERT? Could I get access to this part of the code? My email address is hanbj890@nenu.edu.cn. Hope to hear from you!

dabao12 commented 1 year ago

Thank you very much for your reply. While debugging your code, I have a question: I find that only the 12th encoding layer of BERT is fine-tuned, and bert.pooler is not, but the paper mentions "we only fine-tune the 12th encoding layer and the extra linear layer."
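For context, a minimal sketch of how one might check which parameters are actually trainable while debugging; `BertModel` here stands in for the repo's encoder, and the actual wrapper and its module names may differ:

```python
from transformers import BertModel

# Hedged sketch: list the BERT parameters that currently have requires_grad=True.
# In the repo, the freezing logic sets requires_grad before training; this loop
# only reports the current state so you can see which layers are fine-tuned.
model = BertModel.from_pretrained("bert-base-uncased")

trainable = [name for name, param in model.named_parameters() if param.requires_grad]
print(f"{len(trainable)} trainable parameter tensors")
for name in trainable:
    print(name)
```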

qcwthu commented 1 year ago

Hi, sorry for the late reply. I just checked the code: `unfreeze_layers = ['layer.11', 'bert.pooler.', 'out.']`. I think the final linear layer is included.
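For anyone hitting the same question, here is a minimal sketch of how such an `unfreeze_layers` list is typically applied, assuming the encoder wraps a HuggingFace BERT under an attribute named `bert` and ends with a linear head named `out` (hypothetical names and dimensions; the repo's actual model may differ):

```python
import torch.nn as nn
from transformers import BertModel

# Hypothetical wrapper whose parameter names match the patterns in the reply:
#   'layer.11'     -> the 12th (last) BERT encoder layer
#   'bert.pooler.' -> the BERT pooler
#   'out.'         -> the extra linear layer mentioned in the paper
class BertEncoder(nn.Module):
    def __init__(self, hidden_size=768, out_dim=256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.out = nn.Linear(hidden_size, out_dim)

model = BertEncoder()
unfreeze_layers = ['layer.11', 'bert.pooler.', 'out.']

# Freeze every parameter except those whose full name contains one of the
# patterns; with this list both the pooler and the final linear layer stay trainable.
for name, param in model.named_parameters():
    param.requires_grad = any(layer in name for layer in unfreeze_layers)
```

Substring matching on parameter names like this is a common way to do partial fine-tuning, which would be consistent with both the paper's description and the `unfreeze_layers` list above.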