janhq / cortex.cpp

Run and customize Local LLMs.
https://cortex.so
Apache License 2.0

feat: Support Function Calling in llama3.1 #295

Status: Open · ChristianWeyer opened 9 months ago

ChristianWeyer commented 9 months ago

Goal

Original post

Problem: AFAICS, the current implementation does not support OpenAI Function Calling. This would be a fantastic, powerful, and much-needed feature.

Success Criteria: Any OAI client can be used with Nitro, even (and especially) those that use OAI Function Calling.

References:
https://platform.openai.com/docs/guides/function-calling
https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools
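For context, this is roughly what an OAI client sends when it uses Function Calling: a `tools` array of JSON Schema function descriptions alongside the messages. A minimal sketch of such a request body follows; the model id and the `get_weather` tool are placeholders for illustration, not anything Nitro/cortex-specific.

```python
import json

def build_tool_call_request(user_message: str) -> dict:
    """Build an OpenAI-style /chat/completions body with a `tools` array.

    Model id and tool definition are illustrative placeholders.
    """
    return {
        "model": "llama3.1",  # placeholder model id
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"},
                        },
                        "required": ["city"],
                    },
                },
            }
        ],
        # Let the model decide whether to call the tool.
        "tool_choice": "auto",
    }

body = build_tool_call_request("What is the weather in Berlin?")
print(json.dumps(body, indent=2))
```

A server claiming OAI compatibility would need to accept `tools` and `tool_choice` on this endpoint and, when the model decides to call a function, return a `tool_calls` entry in the assistant message rather than plain text.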

nidhoggr-nil commented 4 months ago

It would be great to have this for the TensorRT backend as well, though I do not know whether it supports function calling.

dan-homebrew commented 1 week ago

I am scoping this to llama3.1 function calling, per the discussion in https://github.com/janhq/cortex.cpp/issues/1151#issuecomment-2339558655.
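For llama3.1 specifically, Meta's documented custom-tool format has the model reply with a bare JSON object such as `{"name": "get_weather", "parameters": {"city": "Berlin"}}`. Bridging that to OAI clients means detecting this shape in the model output and translating it into the OpenAI `tool_calls` response structure (where `arguments` is a JSON string, not an object). A sketch of that translation step, with illustrative names, assuming the server has the raw completion text in hand:

```python
import json
import uuid

def llama31_to_openai_tool_call(model_output: str):
    """Parse a Llama 3.1-style tool-call JSON object from model output.

    Returns an OpenAI-style `tool_calls` entry, or None when the output
    is ordinary text (i.e. the model did not call a tool).
    """
    try:
        obj = json.loads(model_output.strip())
    except json.JSONDecodeError:
        return None  # plain text response, no tool call
    if not isinstance(obj, dict) or "name" not in obj or "parameters" not in obj:
        return None
    return {
        "id": f"call_{uuid.uuid4().hex[:8]}",  # arbitrary unique id
        "type": "function",
        "function": {
            "name": obj["name"],
            # OpenAI clients expect `arguments` as a JSON *string*.
            "arguments": json.dumps(obj["parameters"]),
        },
    }

call = llama31_to_openai_tool_call(
    '{"name": "get_weather", "parameters": {"city": "Berlin"}}'
)
```

A real implementation would also have to handle streaming output and partial JSON, but the core mapping is this small.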