hiyouga / LLaMA-Factory

Efficiently Fine-Tune 100+ LLMs in WebUI (ACL 2024)
https://arxiv.org/abs/2403.13372
Apache License 2.0

Does the api module support function calling? #3589

Closed Stephen-SMJ closed 4 months ago

Stephen-SMJ commented 4 months ago

Reminder

Reproduction

I use the script llamafactory-cli api --model_name_or_path Qwen/Qwen1.5-7B-Chat --template qwen to deploy several models, such as Qwen1.5-7B-Chat, Baichuan2-7B-Chat, Trelis/Meta-Llama-3-8B-Instruct-function-calling, and so on. However, the API does not return a function-calling result even though I pass the functions in the request. So I want to confirm: does this API module support function calling?
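For context, a request that passes functions to the OpenAI-compatible endpoint looks roughly like the sketch below, assuming the default address (http://localhost:8000/v1) started by llamafactory-cli api and the openai>=1.0 Python client; the weather tool is only a placeholder:

```python
from openai import OpenAI

# Assumed defaults: local endpoint started by `llamafactory-cli api`.
client = OpenAI(api_key="0", base_url="http://localhost:8000/v1")

# Placeholder tool definition; any JSON-schema function description works here.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"}
                },
                "required": ["location"],
            },
        },
    }
]

messages = [{"role": "user", "content": "What is the weather like in Beijing?"}]
response = client.chat.completions.create(
    model="Qwen/Qwen1.5-7B-Chat",
    messages=messages,
    tools=tools,
)
# Expected message.tool_calls to be populated, but the model replies with plain text.
print(response.choices[0].message)
```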

Expected behavior

Does it support function calling?

System Info

No response

Others

No response

codemayq commented 4 months ago

Maybe you can use this script as a reference: https://github.com/hiyouga/LLaMA-Factory/blob/main/tests/test_toolcall.py
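This is not the contents of that test file, just a rough sketch of the round trip such a test exercises: ask for a tool call, run the tool locally (mocked here), and send the result back for a final answer. It assumes the default endpoint from llamafactory-cli api; the weather tool is a placeholder.

```python
import json

from openai import OpenAI

client = OpenAI(api_key="0", base_url="http://localhost:8000/v1")

# Placeholder tool definition, same shape as in the reproduction above.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }
]

messages = [{"role": "user", "content": "How is the weather in Beijing today?"}]
first = client.chat.completions.create(
    model="Qwen/Qwen1.5-7B-Chat", messages=messages, tools=tools
)

# A model trained for tool use should return a tool call instead of plain text.
tool_call = first.choices[0].message.tool_calls[0]
print(tool_call.function.name, json.loads(tool_call.function.arguments))

# Append the assistant's tool call and a mocked tool result, then ask again.
messages.append(
    {"role": "assistant", "content": "", "tool_calls": [tool_call.model_dump()]}
)
messages.append(
    {
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps({"temperature": "25 C"}),
    }
)
second = client.chat.completions.create(
    model="Qwen/Qwen1.5-7B-Chat", messages=messages, tools=tools
)
print(second.choices[0].message.content)
```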

hiyouga commented 4 months ago

This API only supports models fine-tuned by LLaMA-Factory on a function-calling dataset; it may not support external models.