Closed surtantheta closed 2 years ago
Hi, if you only want to use the pretrained models for testing, you can simply remove `--do_train --do_eval` here and pass the pretrained model path by adding `--load_model_path your/pretrained/model/path`.
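For concreteness, an inference-only invocation might look like the sketch below. This is a hedged example, not a verified command: the script name `run_gen.py`, the `--do_test` flag, and the checkpoint path are assumptions — check the repo's run scripts for the exact names, and keep whatever other arguments (tokenizer, data dir, beam size, etc.) the script requires.

```shell
# Hedged sketch: run testing/prediction only, with no training or eval.
# --do_train and --do_eval are removed; only a test flag remains.
# Script name, flag names, and paths below are placeholders.
python run_gen.py \
  --do_test \
  --task concode \
  --load_model_path your/pretrained/model/path
```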
Thanks a lot for your reply. Can you also provide a link to your pretrained concode model? Is it the same as the fine-tuned model checkpoints that you have provided in your repo?
We have released all fine-tuned CodeT5-base checkpoints here :)
Can you kindly elaborate on how we can use the fine-tuned checkpoints to generate predictions on new data for the concode task? Say this is my prediction data:

`{"code": "public integer sum(Integer arg0,Integer arg1) {return result;}", "nl": "Add two integers. concode_field_sep int sum concode_field_sep int result"}`

If I understand correctly, concode is supposed to complete these functions. However, I am not sure how to generate a prediction on this sample data. I tried replacing the test file containing the original test data with this sample test data, and then ran this command:
`python run_exp.py --model_tag codet5_small --task concode --sub_task none`
This command starts with training, then evaluating, and finally testing. However, I am interested only in prediction. Is there any way to directly generate predictions from the fine-tuned model on concode? Kindly let me know if I am doing something wrong.
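As a side note on the replacement-file approach: the substituted test file needs to follow the same JSON-Lines layout as the original (one JSON object per line, with `code` and `nl` fields). A minimal sketch that writes the sample record above as a one-line test file — the file name `test.json` is an assumption, so match whatever name the data folder actually uses:

```python
import json

# Sample record in the concode format: "nl" holds the natural-language
# description plus the class context (fields separated by the
# concode_field_sep token), "code" holds the target function.
sample = {
    "code": "public integer sum(Integer arg0,Integer arg1) {return result;}",
    "nl": "Add two integers. concode_field_sep int sum concode_field_sep int result",
}

# Write one JSON object per line (JSONL); file name is a placeholder.
with open("test.json", "w") as f:
    f.write(json.dumps(sample) + "\n")

# Read it back to confirm the format round-trips.
with open("test.json") as f:
    records = [json.loads(line) for line in f]

# The part before the first separator is the NL description.
print(records[0]["nl"].split(" concode_field_sep ")[0])  # → Add two integers.
```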