Closed — Robin-Y-Ding closed this issue 2 years ago

Dear authors,
I noticed in the paper that you pre-train the T5 model with identifier-aware denoising for 100 epochs and then further pre-train with bimodal generation for 50 epochs. I was wondering whether the released model covers only the first 100 epochs or the full 150 epochs?
Thanks in advance for your clarification.

Hi, the released model is pre-trained with only the identifier-aware denoising objective, for 100 epochs.

I see, thank you for your response. May I ask whether there is any plan to release the 150-epoch pre-trained model? I noticed there is a recent checkpoint for multilingual code summarization, so I wondered whether there will be an upcoming model for NL2code generation?

Yes, we are planning to release another checkpoint for NL2code generation using CodeSearchNet. Please stay tuned :)

Great to know! Thanks! Closing the issue.