meta-llama / llama-recipes

Scripts for fine-tuning Meta Llama3 with composable FSDP & PEFT methods, covering single- and multi-node GPU setups. Supports default & custom datasets for applications such as summarization and Q&A. Supports a number of candidate inference solutions, such as HF TGI and vLLM, for local or cloud deployment. Demo apps showcase Meta Llama3 for WhatsApp & Messenger.
11.52k stars · 1.63k forks

How to test my finetuned model #535

Closed — Tizzzzy closed this issue 2 weeks ago

Tizzzzy commented 3 months ago

🚀 The feature, motivation and pitch

I am new to llama-recipes. I have fine-tuned a llama3 model on the "openbookqa" dataset. It stored the model for me at this path: /research/cbim/medical/lh599/research/ruijiang/Dong/llama-recipes/PATH/to/save/PEFT/model. In this model folder there are three files: adapter_config.json, adapter_model.safetensors, and README.md.

My question is: how can I test this fine-tuned model? For example, I want to pass a question like "The sun is responsible for?" and have my model give me an answer.

Alternatives

No response

Additional context

No response

wukaixingxp commented 3 months ago

Hi! An interactive way is to run local inference with `python recipes/inference/local_inference/inference.py --model_name meta-llama/Meta-Llama-3-8B-Instruct --peft_model [PEFT_MODEL_FOLDER]`. For more details, please check out the local inference recipe readme or the model servers readme.

LyuJiayi654 commented 2 months ago

Hello, I have noticed that there is currently no `local_inference` folder. How can I run inference?

> Hi! An interactive way is to run local inference with `python recipes/inference/local_inference/inference.py --model_name meta-llama/Meta-Llama-3-8B-Instruct --peft_model [PEFT_MODEL_FOLDER]`. For more details, please check out the local inference recipe readme or the model servers readme.

wukaixingxp commented 2 months ago

We just made an update to our folder structure; the script has been moved to `recipes/quickstart/inference/local_inference/inference.py`. Sorry for the trouble.

LyuJiayi654 commented 1 month ago

Yeah, got it. Thanks.