Closed sebastienblanc closed 1 month ago
Thanks Sebi!!
+1 !
This is an epic task, as there are a lot of requirements to enable it. The first one is the scope of this feature: "Add function calling support" is very vague, and the example provided does not really cover how we want to expose this in AI Lab. Some questions:
This needs discussion.
Discussed with Jeff. As a first step we have to find a model with function calling support and add it to the catalog (a property will allow us to identify models with function calling support), then create/add a recipe that leverages it.
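To illustrate the catalog idea, here is a minimal sketch of what such a property could look like and how the UI could filter on it. The field name `functionCalling` and the model ids are hypothetical, not the actual catalog schema:

```python
# Sketch of a catalog where a per-model flag advertises function-calling
# support. "functionCalling" is a hypothetical property name.
catalog = {
    "models": [
        {
            "id": "mistral-7b-instruct-v0.3",
            "name": "Mistral 7B Instruct v0.3",
            "functionCalling": True,
        },
        {
            "id": "some-other-model",
            "name": "Some Other Model",
            # models without the flag are treated as not supporting it
        },
    ]
}

def supports_function_calling(model: dict) -> bool:
    """Return True when the catalog entry advertises function calling."""
    return model.get("functionCalling", False)

candidates = [m["id"] for m in catalog["models"] if supports_function_calling(m)]
print(candidates)  # -> ['mistral-7b-instruct-v0.3']
```

A recipe could then restrict its model picker to the `candidates` list.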
@lstocchi Excellent! Any Mistral model starting from 0.3 supports function calling; could that be one of the candidates?
@sebastienblanc I guess so, we'll talk with @slemeur when he comes back from PTO
FYI, Granite 20B now has a function-calling variant: https://huggingface.co/ibm-granite/granite-20b-functioncalling
Any idea when this could be planned? This is a huge missing feature when promoting Podman AI Lab vs. Ollama for local development in our conference sessions and labs.
For reference: https://ollama.com/blog/tool-support
Pushed to the next release; we'll firm up the date ASAP.
This is not completed.
We still need a recipe, a compatible model in the catalog, and some documentation.
Is your feature request related to a problem? Please describe
Function calling is getting more and more popular and opens up some crazy use cases.
Describe the solution you'd like
For models that support function calling, the OpenAI API "wrapper" should return the correct JSON structure containing the tool-call response.
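As a concrete reference, the shapes below sketch the OpenAI chat-completions contract: the request passes a `tools` array, and when the model decides to call a tool, the assistant message carries `tool_calls` and `finish_reason` is `"tool_calls"`. The model id and tool definition here are illustrative only:

```python
import json

# Request payload following the OpenAI chat-completions "tools" schema.
request = {
    "model": "mistral-7b-instruct-v0.3",  # hypothetical model id
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

# Shape of the response the inference server should return when the model
# chooses to call a tool: the assistant message contains "tool_calls",
# with arguments serialized as a JSON string.
response = {
    "choices": [{
        "finish_reason": "tool_calls",
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_0",
                "type": "function",
                "function": {
                    "name": "get_weather",
                    "arguments": json.dumps({"city": "Paris"}),
                },
            }],
        },
    }]
}

# A client would dispatch on the returned tool call like this:
call = response["choices"][0]["message"]["tool_calls"][0]["function"]
print(call["name"], json.loads(call["arguments"]))  # -> get_weather {'city': 'Paris'}
```

The key point for AI Lab is that the served endpoint emits this `tool_calls` structure rather than plain text, so existing OpenAI clients work unchanged.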
Describe alternatives you've considered
No response
Additional context
I know you've put that on the roadmap in the README, but I didn't find any issue to upvote, so here it is :) Awesome extension, btw!
Sub tasks: