SMUsamaShah opened this issue 9 months ago
Lumos doesn't support function calling at the moment, but I'm curious to learn more about the use case. @SMUsamaShah can you share a specific use case with the tool/function that should be called?
LangChain has an experimental OllamaFunctions chat model: https://js.langchain.com/docs/integrations/chat/ollama_functions
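For reference, the usage on that docs page looks roughly like this. It's experimental, so treat this as a sketch: the import path, model name, and output shape may have changed between versions.

```typescript
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { HumanMessage } from "@langchain/core/messages";

// Wrap a local Ollama model and bind an OpenAI-style function schema to it.
const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
});

const response = await model.invoke([
  new HumanMessage("What's the weather like in Boston?"),
]);

// The chosen function name and JSON-encoded arguments come back on the message,
// rather than a plain-text answer.
console.log(response.additional_kwargs.function_call);
```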
The most commonly cited example is asking about the weather: the model returns appropriate arguments for a weather function, which then calls an API and reports the weather.
You can ask it to perform a math calculation, and a function (which can be as simple as a JS eval) will solve it. Ask it to draw an image, and a local or remote image generator can be called.
A better use for me: you ask the LLM to Google something, it gives back a function call that fetches the search results entirely in your browser, goes through each relevant page via the LLM, and gives you what you need (rough sketch below).
Basically ChatGPT plugins, but running locally in your browser.
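To make the dispatch idea concrete, here's roughly what I imagine on the extension side. This is purely hypothetical, with made-up handler names (`google_search`, `calculate`), not anything Lumos has today:

```typescript
// Hypothetical dispatcher: map function calls returned by the model to
// handlers that run right in the browser/extension.
type FunctionCall = { name: string; arguments: string };

const handlers: Record<string, (args: any) => Promise<string>> = {
  // "Google something": fetch a search results page in the browser and hand
  // the raw text back to the LLM to pick out relevant links.
  google_search: async ({ query }) => {
    const res = await fetch(
      `https://www.google.com/search?q=${encodeURIComponent(query)}`
    );
    return await res.text();
  },
  // Simple math: a JS-eval-style calculator (illustration only, not sandboxed).
  calculate: async ({ expression }) =>
    String(Function(`return (${expression})`)()),
};

async function dispatch(call: FunctionCall): Promise<string> {
  const handler = handlers[call.name];
  if (!handler) throw new Error(`No handler for ${call.name}`);
  // Function-calling models return arguments as a JSON string.
  return handler(JSON.parse(call.arguments));
}
```

The result of `dispatch` would then be fed back to the model as context for the final answer.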
Can it do function calling? It will automate so much stuff if it can. Please close this if it already can do that.
I have played with function calling on ChatGPT and tried to make a local ChatGPT-based tool. But I can't just let ChatGPT go to pages and do research for me (it would be too expensive).
With function calling, Lumos would be able to answer any question by sending the request to the appropriate tool.