matrixgpt / matrix-chatgpt-bot

Talk to ChatGPT via any Matrix client!

ollama support #304

Open · ToeiRei opened this issue 4 months ago

ToeiRei commented 4 months ago

As far as I know, Ollama is working on OpenAI API compatibility.

To my understanding, pointing the bot at your local Ollama instance instead of the OpenAI endpoint should work... (I have not tested this yet.)
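A rough, untested sketch of what that could look like in the bot's .env. Ollama serves its OpenAI-compatible API on port 11434 by default; CHATGPT_REVERSE_PROXY is the setting mentioned later in this thread, and the other variable names are assumptions based on the bot's example configuration, so adjust to your setup:

```
# Untested sketch: matrix-chatgpt-bot pointed at a local Ollama instance.

# Ollama ignores the key, but the OpenAI client usually requires one to be set.
OPENAI_API_KEY=ollama

# Send chat completion requests to Ollama instead of api.openai.com.
CHATGPT_REVERSE_PROXY=http://localhost:11434/v1/chat/completions

# Must be a model that has actually been pulled locally, e.g. `ollama pull llama3`.
CHATGPT_API_MODEL=llama3
```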

philidinator commented 4 months ago

Hi, I'm also interested. Did you try it out yet?

ToeiRei commented 4 months ago

No, I have not spoofed the endpoint via /etc/hosts yet, as I run a more complex setup here.
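For anyone who does want to try the /etc/hosts route, a minimal sketch of the override (the IP is a placeholder for the machine running Ollama):

```
# Hypothetical /etc/hosts entry: resolve api.openai.com to a local machine.
192.168.1.50    api.openai.com
```

Note that this only remaps the hostname: the bot would still connect on port 443 over TLS and expect a certificate for api.openai.com, so a local reverse proxy terminating TLS would also be needed. Overriding the endpoint URL directly, as in the next comment, is the simpler route.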

philidinator commented 4 months ago

I tried setting `CHATGPT_REVERSE_PROXY=http://ip:11434/v1/chat/completions`, but unfortunately it didn't work. I always get this error:

[ERROR] [OpenAI-API Error: Error: Failed to send message. HTTP 400 - {"error":{"message":"[] is too short - 'messages'","type":"invalid_request_error","param":null,"code":null}}]
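One way to narrow this down: the 400 means the request did reach an OpenAI-style endpoint, but with an empty messages array. The endpoint itself can be sanity-checked with a well-formed request ("llama3" is a placeholder for whatever model is pulled locally):

```
# Minimal well-formed request against Ollama's OpenAI-compatible endpoint,
# with a non-empty "messages" array.
curl http://ip:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [
          {"role": "user", "content": "Hello"}
        ]
      }'
```

If that returns a completion, Ollama is fine and the error points at the request the bot builds when CHATGPT_REVERSE_PROXY is set, since it appears to send an empty messages array on that code path.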

Expro commented 3 months ago

Same here.