ChristianWeyer opened 9 months ago
Would be great to have this for the TensorRT backend as well, though I do not know whether function calling is supported there.
Goal

I am scoping this to llama3.1 function calling, as per the discussion here: https://github.com/janhq/cortex.cpp/issues/1151#issuecomment-2339558655

Original post
Problem

AFAICS, the current implementation does not have OpenAI Function Calling support. This would be a fantastic, powerful, and much-needed feature.
Success Criteria

Any OAI client can be used with Nitro, even (and especially) those that use OAI Function Calling.
Reference:
https://platform.openai.com/docs/guides/function-calling
https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools
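For context, here is a minimal sketch of the kind of request body an OAI client would send when using function calling, following the `tools` schema from the OpenAI Chat Completions API linked above. The function name (`get_weather`), model identifier, and endpoint path are illustrative assumptions, not part of the issue:

```python
import json

# Hypothetical tool definition in the OpenAI "tools" schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # example function, not a real API
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"}
                },
                "required": ["city"],
            },
        },
    }
]

# Request body an OAI client would POST to /v1/chat/completions
# on an OpenAI-compatible server; "llama3.1" is an assumed model id.
request_body = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "What is the weather in Berlin?"}],
    "tools": tools,
    "tool_choice": "auto",
}

# A server supporting function calling may respond with a "tool_calls"
# entry in the assistant message instead of plain text content.
print(json.dumps(request_body, indent=2))
```

Supporting this shape of request (and emitting `tool_calls` in the response) is what would let existing OAI clients work unchanged.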