kartikm7 / llocal

Aiming to provide a seamless, privacy-driven chat experience with open-source technologies (Ollama), particularly open-source LLMs (e.g. Llama 3, Phi-3, Mistral). Focused on ease of use. Available on both Windows and Mac.
https://www.llocal.in
MIT License

Native ollama tool calling #11

Open minzdrav opened 4 months ago

minzdrav commented 4 months ago

Hi @kartikm7, see https://ollama.com/blog/tool-support. It can be used for web search, web browsing, file parsing, etc.
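For context, the native tool calling announced in that post works by passing JSON-schema tool definitions in the chat request and getting back `tool_calls` in the assistant message, which the client then executes and feeds back as `tool`-role messages. A rough, offline sketch of that shape (the fake response and the `web_search` registry below are illustrative stand-ins, not LLocal code):

```python
import json

# Tool schema in the JSON-schema style Ollama's chat "tools" field expects
# (shape per the Ollama tool-support announcement; a sketch, not LLocal's code).
web_search_tool = {
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web and return result snippets",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search query"},
            },
            "required": ["query"],
        },
    },
}

def dispatch_tool_calls(message, registry):
    """Run each tool call the model requested and collect 'tool'-role replies."""
    results = []
    for call in message.get("tool_calls", []):
        fn = call["function"]
        output = registry[fn["name"]](**fn["arguments"])
        results.append({"role": "tool", "content": json.dumps(output)})
    return results

# A stand-in assistant message shaped like a tool-call response.
fake_message = {
    "role": "assistant",
    "tool_calls": [
        {"function": {"name": "web_search", "arguments": {"query": "ollama tools"}}}
    ],
}

registry = {"web_search": lambda query: {"snippets": [f"results for {query}"]}}
print(dispatch_tool_calls(fake_message, registry))
```

The `tool` replies would then be appended to the conversation and sent back to the model for the final answer, which is the extra round trip discussed below.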

kartikm7 commented 4 months ago

Hello @minzdrav! I'm still not sure how we should go about implementing tool calling. Personally, I feel the extra API call adds latency in most cases, so toggle switches in the UI make for an easier way to use tools. But then again, as the number of tools grows there are only so many toggles we can keep, and another big benefit of native tool calling would be community implementations. That would be pivotal, just like what Open WebUI provides.
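The toggle idea above could look something like this: a minimal sketch (all names hypothetical, not LLocal's actual code) where only toggled-on tools are ever sent to the model, so when nothing is enabled the request carries no tool schemas and no extra round trip happens:

```python
# Hypothetical toggle registry mapping a UI switch to a tool schema.
AVAILABLE_TOOLS = {
    "web_search": {
        "type": "function",
        "function": {
            "name": "web_search",
            "description": "Search the web for a query",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    },
    "file_parse": {
        "type": "function",
        "function": {
            "name": "file_parse",
            "description": "Parse a local file into plain text",
            "parameters": {
                "type": "object",
                "properties": {"path": {"type": "string"}},
                "required": ["path"],
            },
        },
    },
}

def tools_for_request(toggles: dict) -> list:
    """Return only the schemas for tools the user toggled on in the UI."""
    return [schema for name, schema in AVAILABLE_TOOLS.items() if toggles.get(name)]

print(len(tools_for_request({"web_search": True})))  # only the enabled schema
print(tools_for_request({}))                         # no tools: plain chat request
```

This keeps the per-toggle UX while leaving the door open for native tool calling later, since the same registry could feed either path.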

With LLocal, we've already implemented web search and will soon be implementing RAG with readable formats (starting with PDF). There are plans for a knowledge-base section with major tools provided out of the box with LLocal. Also, a philosophy behind building LLocal is that whenever I implement a feature, I want to leave enough room for future improvement: something that's flexible to update but cohesive with the rest of the architecture. So best believe, it will be implemented if and when the need arises.
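The RAG direction mentioned above could be sketched roughly as follows, assuming the PDF has already been extracted to plain text upstream (a parser such as pypdf would typically handle that step). The chunking and scoring here are deliberately naive placeholders, not LLocal's implementation:

```python
def chunk_text(text: str, size: int = 12, overlap: int = 4) -> list:
    """Split text into overlapping windows of `size` words."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

def retrieve(chunks: list, query: str, k: int = 1) -> list:
    """Rank chunks by how many lowercase words they share with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        chunks,
        key=lambda c: len(terms & set(c.lower().split())),
        reverse=True,
    )
    return scored[:k]

doc = ("LLocal runs models locally with Ollama. Web search is built in. "
       "RAG with PDF files is planned next.")
top = retrieve(chunk_text(doc), "PDF RAG")
print(top)
```

A real pipeline would swap the word-overlap scoring for embeddings, but the chunk-then-retrieve shape stays the same, which fits the "room for future improvement" philosophy.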

That said, looking into tool calling is definitely on the roadmap.

Thanks for the suggestion man!