JustinZou1 opened 11 months ago
Hi, this might be because the modeling class is stored in a remote Hugging Face repo (see here). You can download modeling_codet5p.py and configuration_codet5p.py to your local environment and then import the model class from them to load your finetuned checkpoint (without trust_remote_code=True).
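The workaround above can be sketched roughly as follows. This is a minimal, hedged example, assuming modeling_codet5p.py (downloaded from the model's Hub repo) sits next to the script; the class name `CodeT5pEncoderDecoderModel` and the helper `load_finetuned_codet5p` are illustrative and should be checked against the downloaded file, and the checkpoint path is a placeholder.

```python
# Sketch: load a finetuned CodeT5+ checkpoint without trust_remote_code=True
# by importing the model class from the locally downloaded modeling file.
# Assumption: modeling_codet5p.py / configuration_codet5p.py were downloaded
# from the Hub repo into the same directory as this script; verify the actual
# class name inside the downloaded modeling_codet5p.py.

def load_finetuned_codet5p(checkpoint_dir: str):
    """Return (tokenizer, model) loaded from a local finetuned checkpoint."""
    from transformers import AutoTokenizer
    # Imported from the local file, not fetched remotely from the Hub.
    from modeling_codet5p import CodeT5pEncoderDecoderModel  # hypothetical name

    tokenizer = AutoTokenizer.from_pretrained(checkpoint_dir)
    model = CodeT5pEncoderDecoderModel.from_pretrained(checkpoint_dir)
    return tokenizer, model


if __name__ == "__main__":
    # Placeholder path; point this at your own final_checkpoint directory.
    tok, model = load_finetuned_codet5p(
        "output/instruct_codet5p_6b/final_checkpoint"
    )
    prompt = "Write a Python function that reverses a string."
    inputs = tok(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tok.decode(outputs[0], skip_special_tokens=True))
```

The heavy imports live inside the function so the module can be imported without transformers or the checkpoint being present.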
I have finished instruction tuning with code_alpaca_20k.json.
The final model is in the folder "/home/ubuntu/ChatGPT/CodeGen/CodeT5/CodeT5+/output/instruct_codet5p_6b/final_checkpoint". When I tried to run inference, I got the following issue:
This is the inference code: (codegen) ubuntu@chatbot-a10:~/ChatGPT/CodeGen/CodeT5/CodeT5+$ cat cli1.py