hiyouga / LLaMA-Factory

A WebUI for Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
https://arxiv.org/abs/2403.13372
Apache License 2.0

CodeLLaMa Support #1777

Closed: harrychih closed this issue 7 months ago

harrychih commented 7 months ago

Reminder

Reproduction

Does the current version support Code Llama for fine-tuning and inference? That is, can I reuse the LLaMA-2 configuration to fine-tune Code Llama? (If not, do you plan to support it in the future?)

Expected behavior

No response

System Info

No response

Others

No response

hiyouga commented 7 months ago

Sure, the configuration is the same as for LLaMA-2; just point it at a Code Llama checkpoint.
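
For reference, a fine-tuning invocation along these lines might look like the sketch below, which reuses LLaMA-2-style arguments with a Code Llama checkpoint. The specific flags, dataset name, and hyperparameters are assumptions drawn from LLaMA-Factory's README of that period and are not confirmed in this thread.

```shell
# Hypothetical sketch: LoRA fine-tuning of Code Llama with LLaMA-Factory,
# passing the same arguments one would use for LLaMA-2.
# Model path, dataset, template, and hyperparameters are illustrative assumptions.
CUDA_VISIBLE_DEVICES=0 python src/train_bash.py \
    --stage sft \
    --do_train \
    --model_name_or_path codellama/CodeLlama-7b-hf \
    --dataset alpaca_gpt4_en \
    --template llama2 \
    --finetuning_type lora \
    --lora_target q_proj,v_proj \
    --output_dir saves/codellama-7b-lora \
    --per_device_train_batch_size 4 \
    --gradient_accumulation_steps 4 \
    --learning_rate 5e-5 \
    --num_train_epochs 3.0 \
    --fp16
```

The same arguments with `--stage sft` replaced by the inference entry point would then serve the fine-tuned adapter, under the same assumption that Code Llama follows the LLaMA-2 template.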