Reminder
[X] I have read the README and searched the existing issues.
Reproduction
Does the current version support CodeLlama for fine-tuning and inference? That is, can I reuse the LLaMA-2 configuration to fine-tune CodeLlama? (If not, do you plan to support it in the future?)
Expected behavior
No response
System Info
No response
Others
No response