HelgeSverre / ollama-gui

A Web Interface for chatting with your local LLMs via the ollama API
https://ollama-gui.vercel.app/
MIT License
489 stars 79 forks

Using Ollama chat API #16

Open emsi opened 7 months ago

emsi commented 7 months ago

Hi!

There's a relatively new ollama chat API: https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-chat-completion

It works very similarly to the OpenAI chat API:

curl http://localhost:11434/api/chat -d '{
  "model": "llama2",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
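As a rough sketch of what the migration could look like on the client side (the helper and types below are illustrative, not from the ollama-gui codebase), the GUI would build a message-history payload instead of a single prompt string:

```typescript
// Message shape for Ollama's /api/chat endpoint (mirrors the curl example above).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: assemble the fetch options for a chat request.
// With "stream": false, Ollama returns a single JSON response instead of chunks.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

// Usage (not executed here):
// fetch("http://localhost:11434/api/chat",
//   buildChatRequest("llama2", [{ role: "user", content: "why is the sky blue?" }]));
```

Because the server applies each model's prompt template to the `messages` array, the GUI no longer has to format prompts itself.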

Since I've experienced a lot of problems with ollama-gui and newer models (e.g. nous-hermes2-mixtral) that look like prompt-formatting issues, I think migrating to this API would solve most of them.


HelgeSverre commented 7 months ago

Seems like a good idea, will investigate over the weekend

emsi commented 7 months ago

> Seems like a good idea, will investigate over the weekend

Any thoughts on that?

HelgeSverre commented 7 months ago

💰 Bug bounty of $30 to whoever implements this.*

(*) Paid via PayPal or whatever payment service where you can send me a link, no wire transfers.

emsi commented 7 months ago

> 💰 Bug bounty of $30 to whoever implements this.*
>
> (*) Paid via PayPal or whatever payment service where you can send me a link, no wire transfers.

Ollama has implemented an OpenAI-compatible API, so you can point to that:

https://ollama.ai/blog/openai-compatibility
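To illustrate (the constant and helper here are mine, not project code): with the compatibility layer, the GUI could target `/v1/chat/completions` with a standard OpenAI-style body, assuming Ollama's default port.

```typescript
// Assumption: Ollama running locally on its default port; /v1 is the
// OpenAI-compatibility base path described in the blog post above.
const OLLAMA_OPENAI_BASE = "http://localhost:11434/v1";

// Hypothetical helper: build an OpenAI-style chat completion body.
function openAiChatBody(model: string, userPrompt: string): string {
  return JSON.stringify({
    model,
    messages: [{ role: "user", content: userPrompt }],
  });
}

// Usage (not executed here):
// fetch(`${OLLAMA_OPENAI_BASE}/chat/completions`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: openAiChatBody("llama2", "why is the sky blue?"),
// });
```

A nice side effect of this route is that existing OpenAI client libraries can be reused by just overriding their base URL.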

ebanDev commented 2 months ago

Hi! If this is still something you'd like to work on, I'd be eager to have the issue assigned to me :)

HelgeSverre commented 2 months ago

Did someone do this? I feel like someone did, but I'm not certain...

SwingBling commented 2 weeks ago

I just started using Ollama GUI today, and I also see similar prompt issues when using the model deepseek-coder-v2.