containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

Add function calling support #1116

Closed sebastienblanc closed 1 month ago

sebastienblanc commented 5 months ago

Is your feature request related to a problem? Please describe

Function calling is getting more and more popular and opens up some crazy use cases.

Describe the solution you'd like

For models that support function calling, the OpenAI API "wrapper" should return the correct JSON structure containing the "tools" response.
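
To make the expected shape concrete, here is a minimal sketch of a client request with a `tools` definition against an OpenAI-compatible endpoint, using the `openai` Python SDK; the base URL, port, and model name are placeholders for whatever the AI Lab inference server exposes, not actual values.

```python
from openai import OpenAI

# Placeholder endpoint/model: point this at the local AI Lab inference server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistral-7b-instruct-v0.3",  # placeholder model name
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=tools,
)

# With function calling supported, the assistant message should carry
# structured `tool_calls` rather than plain text content.
print(response.choices[0].message.tool_calls)
```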

Describe alternatives you've considered

No response

Additional context

I know you've put that on the roadmap in the README, but I didn't find any issue to upvote, so here it is :) Awesome extension btw!

Sub tasks:

slemeur commented 5 months ago

Thanks Sebi!!

kdubois commented 5 months ago

+1 !

jeffmaury commented 5 months ago

See https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb for an example
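
The linked notebook drives a function-calling-capable model served by llama-cpp-python's OpenAI-compatible server using the standard `openai` client. Roughly, the full round trip looks like the sketch below; the server URL, model name, and stubbed weather lookup are assumptions for illustration, not taken from the notebook.

```python
import json
from openai import OpenAI

# Assumed: a local llama-cpp-python server exposing an OpenAI-compatible API.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
MODEL = "functionary"  # placeholder: any model/chat format with function calling

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather in Paris?"}]
first = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]

# Execute the requested function ourselves (stubbed result here)...
args = json.loads(call.function.arguments)
result = {"city": args["city"], "forecast": "sunny", "temperature_c": 21}

# ...then return it as a "tool" message so the model can compose the final answer.
messages.append(first.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)})
final = client.chat.completions.create(model=MODEL, messages=messages, tools=tools)
print(final.choices[0].message.content)
```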

axel7083 commented 5 months ago

This is an epic task: there are a lot of requirements to enable it, the first one being the scope of this feature. "Add function calling support" is very vague, and the example provided does not really cover how we want to expose this in AI Lab. Some open questions remain.

This needs discussion.

lstocchi commented 5 months ago

Discussed with Jeff. As first steps, we have to find a model with function calling support, add it to the catalog (a property will allow us to identify models with function calling support), then create/add a recipe that leverages it.
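
Purely to illustrate the catalog-property idea (the real AI Lab catalog schema and property name may well differ), a sketch of how a capability flag could be used to filter models:

```python
# Hypothetical sketch: the property name and catalog shape are assumptions,
# not the actual AI Lab catalog schema.
catalog_models = [
    {"id": "mistral-7b-instruct-v0.3", "properties": {"function-calling": True}},
    {"id": "some-other-model", "properties": {}},
]

def supports_function_calling(model: dict) -> bool:
    """True when the (hypothetical) function-calling flag is set on the entry."""
    return bool(model.get("properties", {}).get("function-calling"))

print([m["id"] for m in catalog_models if supports_function_calling(m)])
# -> ['mistral-7b-instruct-v0.3']
```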

sebastienblanc commented 5 months ago

@lstocchi Excellent! Any Mistral model starting from 0.3 supports function calling; could that be one of the candidates?

lstocchi commented 5 months ago

@sebastienblanc I guess so; we'll talk with @slemeur when he comes back from PTO.

jamesfalkner commented 4 months ago

FYI, Granite 20B now has a function-calling variant: https://huggingface.co/ibm-granite/granite-20b-functioncalling

kdubois commented 2 months ago

Any idea when this could be planned? This is a huge missing feature when promoting Podman AI Lab vs. Ollama for local development in our conference sessions and labs.

kdubois commented 2 months ago

For reference: https://ollama.com/blog/tool-support

jeffmaury commented 2 months ago

Next release; will push the date ASAP.

axel7083 commented 1 month ago

This is not completed

axel7083 commented 1 month ago

We need a recipe, a compatible model in the catalog, and some documentation.