Closed: OsamaAlRashed closed this issue 1 month ago.
@OsamaAlRashed thanks for your input, that's a very good question.
Basically, function calling is a model capability that a given model must be fine-tuned for; currently I'm not aware of any local model that supports function calling yet.
That said, we have three different connectors that support auto function invocation with function calling: MistralAI, Gemini, and OpenAI.
With the new function-calling abstractions we will have a clear path to enable function tool calling (OpenAI standard) in our other connectors, but they will eventually fail if run against a model that is not fine-tuned for this capability.
@RogerBarreto, thanks for your reply.
I have a question for clarification: is the auto-invocation of kernel functions handled by the model itself or by the package? My understanding is that the model is a black box, where the input is text and the output is text, so I assumed that auto-invocation is managed in the code after receiving the model's reply.
If my understanding is correct, shouldn't the AutoInvoke feature work regardless of the model type?
> Is the auto-invocation of kernel functions handled by the model itself or by the package?
The SK library looks in the Kernel to see what plugins are available, and includes a list of all of the functions in the request to the model. If the model decides to request an invocation of a function, it sends back a reply that makes the request, detailing which function to invoke and with what arguments. The SK library decodes that request, finds the relevant function, invokes it with the deserialized arguments, and then constructs a new request to the model that includes the result of the invoked function. That new request is sent back to the model, at which point the model might choose to request another function or not.
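The loop above can be sketched in plain Python. This is a minimal illustration of the protocol, not SK's actual implementation: the model is simulated by a stub function, and the function registry and names (`get_weather`, `tool_call`) are hypothetical stand-ins.

```python
import json

# Hypothetical registry of host-side functions (stand-ins for SK plugins).
FUNCTIONS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def fake_model(messages):
    """Stub standing in for the model: on the first turn it 'decides' to
    request a function call; once it has seen a tool result, it answers in text."""
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "assistant",
                "tool_call": json.dumps({"name": "get_weather",
                                         "arguments": {"city": "Paris"}})}
    return {"role": "assistant", "content": "It's sunny in Paris."}

def chat(user_text, model=fake_model, max_rounds=5):
    messages = [{"role": "user", "content": user_text}]
    for _ in range(max_rounds):
        reply = model(messages)
        call = reply.get("tool_call")
        if call is None:                 # plain text reply: we're done
            return reply["content"]
        req = json.loads(call)           # decode the function request
        result = FUNCTIONS[req["name"]](**req["arguments"])
        messages.append(reply)           # keep the call in the history
        messages.append({"role": "tool", "content": str(result)})
    raise RuntimeError("too many tool-call rounds")
```

The key point for the question above: everything except the "decide to request a function" step happens in host code, but that one step is exactly what the model must be trained to do.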
> Basically function calling is a Model Capability where the given model was fine tuned for, currently I don't have any knowledge of a Local model that does support function calling yet.
There are local models that do, e.g. https://ollama.com/library/mistral:7b. We really need a local story for function calling. Local is a very important scenario, and without planners, function calling is the only answer.
I am currently learning LLMs and SK with LocalAI, so this is a much-needed capability on my end as well. Function calling does not seem to work at all, even without automatic invocation.
This issue is stale because it has been open for 90 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.
I have a local model (gguf) that I'm using to build a chatbot. The chatbot should be able to answer questions about my application and allow users to add entries to the database. To achieve this, I have defined a set of kernel functions:
Current Approach:
Since there is no option like AutoInvokeKernelFunctions for the local model, I handle the functionality manually by providing the bot with a predefined template in the history:
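A template of the kind described might look like the following. This is a hypothetical reconstruction, not the author's actual prompt; the function names (`add_entry`, `search_entries`) are invented for illustration.

```python
# Hypothetical system prompt asking the model to emit a JSON function call
# instead of free text when it wants to act.
SYSTEM_TEMPLATE = """\
You are a helpful assistant for my application.
You may call these functions by replying with JSON only:
  add_entry(title: string, body: string) -- add an entry to the database
  search_entries(query: string)          -- search existing entries
To call a function, reply with exactly:
  {"function": "<name>", "arguments": { ... }}
Otherwise, reply in plain text."""
```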
I then manually invoke the kernel functions as follows:
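The manual dispatch step described here can be sketched as follows. This is an assumed shape, since the issue does not show the actual code: the function names and the JSON call format are hypothetical, and real kernel functions would replace the lambdas.

```python
import json

# Hypothetical stand-ins for the kernel functions; the real plugin names
# and signatures are not shown in the issue.
KERNEL_FUNCTIONS = {
    "add_entry": lambda title, body: f"added: {title}",
    "search_entries": lambda query: [],
}

def try_invoke(model_reply: str):
    """Manual dispatch: if the reply parses as a function-call JSON object,
    invoke the matching kernel function; otherwise return the text as-is."""
    try:
        req = json.loads(model_reply)
    except json.JSONDecodeError:
        return model_reply                       # plain-text answer
    if not isinstance(req, dict) or "function" not in req:
        return model_reply                       # JSON, but not a call
    fn = KERNEL_FUNCTIONS.get(req["function"])
    if fn is None:
        return f"unknown function: {req['function']}"
    return fn(**req.get("arguments", {}))
```

Every edge case (malformed JSON, unknown names, bad arguments) must be handled by hand, which is exactly the burden described in the problem below.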
Problem:
Manually handling the invocation of kernel functions is cumbersome and makes it difficult to support all possible cases.
Request:
Is there a plan to support the AutoInvokeKernelFunctions feature for local models, regardless of the model being used?