I am facing some problems when trying to train PLBART.
The first problem is: I noticed you set args.model_name_or_path to "../../checkpoint_11_100000.pt" both when training PLBART and in the attack, but I could not find this file.
The second problem concerns the config declaration `config = argparse.Namespace(activation_fn='gelu', adam_betas='(0.9, 0.98)', adam_eps=1e-08, ...)` in run.py. I cannot understand some of the parameters there, such as `data='/home/zzr/CodeStudy/Defect-detection/plbart/processed/data-bin'`. Are these parameters required during fine-tuning?
I also noticed that in the same statement you set `restore_file='/data2/cg/CodeStudy/PLBART/pretrain/checkpoint_11_100000.pt'`. Should I run that program first to obtain "checkpoint_11_100000.pt"?
Thank you very much; I am eager to hear your answer!
For PLBART training, we completely reused the code from the ISSTA '22 work; you can refer to https://github.com/ZZR0/ISSTA22-CodeStudy. I believe these parameters do not affect model training.
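For anyone hitting the same problem: the machine-specific paths baked into the `argparse.Namespace` (such as `data` and `restore_file`) come from the original authors' environment and can simply be overridden to point at your own files. A minimal sketch, assuming the config object is built as shown in run.py (only a few fields are reproduced here; the actual Namespace in the repository has many more):

```python
import argparse

# A few fields copied from the run.py config; the real declaration
# contains the full set of fairseq training arguments.
config = argparse.Namespace(
    activation_fn='gelu',
    adam_betas='(0.9, 0.98)',
    adam_eps=1e-08,
    # Machine-specific paths from the original authors' environment:
    data='/home/zzr/CodeStudy/Defect-detection/plbart/processed/data-bin',
    restore_file='/data2/cg/CodeStudy/PLBART/pretrain/checkpoint_11_100000.pt',
)

# Override the paths for your own setup before fine-tuning.
# These two paths are placeholders; substitute your own locations.
config.data = './processed/data-bin'               # your binarized dataset
config.restore_file = './checkpoint_11_100000.pt'  # the downloaded checkpoint
```

Since `argparse.Namespace` is just a plain attribute container, reassigning the fields this way is enough; no re-parsing is needed.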
Please use the file downloaded as described in the first point.
Thanks for your interesting work.