baptisteArno / typebot.io

💬 Typebot is a powerful chatbot builder that you can self-host.
https://typebot.io

Add Ollama block #1406

Closed Lanhild closed 3 months ago

Lanhild commented 8 months ago

APIs like Ollama's are fully compatible with the OpenAI spec, so we should have the capability of setting a custom endpoint in the OpenAI block credentials. [screenshot of the OpenAI block credentials]
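To illustrate the point, here is a minimal stdlib-only sketch (not Typebot code) showing that an OpenAI-compatible backend like Ollama differs only in its base URL; the request payload is identical, which is why a single "custom endpoint" field in the credentials would be enough. The base URLs below are assumptions: Ollama's default local port is 11434.

```python
import json

# Assumption: Ollama running locally with its default OpenAI-compatible
# endpoint; adjust the host/port for a server deployment.
OPENAI_BASE = "https://api.openai.com/v1"
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for an OpenAI-style chat completion.

    The body is the same for both backends; only base_url (and the API
    key header, not shown) changes.
    """
    url = f"{base_url}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

openai_url, payload = build_chat_request(OPENAI_BASE, "gpt-4o-mini", "Hello")
ollama_url, _ = build_chat_request(OLLAMA_BASE, "llama3", "Hello")
```

Swapping backends is then purely a configuration change, not a new block type.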

baptisteArno commented 8 months ago

Ollama is local. I don't think that really makes sense for a tool like Typebot, which is 100% online.

What would be the use case?

Lanhild commented 8 months ago

Ollama is also deployed on servers, and making this block compatible with other OpenAI-based APIs would allow them to be used in Typebot.

It would simply require adding an option to specify the endpoint in the block credentials.

ZimaBlueee commented 4 months ago

@baptisteArno Hi

Local LLMs are really important and increasingly popular.

Please also add Xinference, which supports more models than Ollama and has more comprehensive features.

Official documentation: https://inference.readthedocs.io/en/latest/index.html

baptisteArno commented 3 months ago

I don't think it is a good idea to offer an Ollama block, since most people would want to consume it locally. Typebot is 100% online by default, so it won't work with local Ollama instances. Closing this for now.

easaw commented 3 months ago

Hi, has this been deployed, or is it still not planned?

A use case: an organisation can use local AI to let (sales) staff query service and staff documentation that is complex and updated frequently, but the data must remain internal only.

I can't find another way to do this without using extensive cloud resources (and yes, I am aware of Botpress, langflow, localAI, lobechat, anythingLLM, LMStudio, ActivePieces/ n8n, AsktheDoc, Docuchat, ChatDoc, Documind and PrivateGPT etc...)

easaw commented 3 months ago

@Lanhild I did think of one way to do this; not the best, but:

Change the hosts file on the machine running Typebot so that api.openai.com (or whatever endpoint it calls) resolves to the IP of your Ollama instance instead.

Assuming all users are local, they could chat and Typebot would (should) communicate with Ollama over the local IP.
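A sketch of that hosts-file workaround, with an assumed placeholder IP (192.0.2.10) standing in for the Ollama server:

```shell
# Assumption: Ollama is reachable at 192.0.2.10 on the local network;
# replace with your server's actual IP. Appends an override so that
# api.openai.com resolves there on the Typebot machine.
echo "192.0.2.10  api.openai.com" | sudo tee -a /etc/hosts
```

One caveat with this approach: Typebot would still call `https://api.openai.com` on port 443, while Ollama listens for plain HTTP on port 11434 by default, so a reverse proxy terminating TLS on 443 and forwarding to Ollama would likely also be needed.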

rizwan95 commented 2 weeks ago

Typebot must support Ollama. It is the need of the hour.