fmaclen / hollama

A minimal web-UI for talking to Ollama servers
https://hollama.fernando.is
MIT License

Support OpenAI API format #94

Open anagri opened 1 month ago

anagri commented 1 month ago

Support OpenAI API format by giving option to switch between Ollama proprietary API format and OpenAI API format.

To fetch the list of models - https://platform.openai.com/docs/api-reference/models/list?lang=node.js

To generate chat completions - https://platform.openai.com/docs/api-reference/chat?lang=node.js

This will allow servers that follow the OpenAI API format to be used with Hollama. E.g. Bodhi App - https://github.com/BodhiSearch/BodhiApp
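To make the two endpoints concrete, here is a minimal sketch of the request shapes involved. The paths (`/v1/models`, `/v1/chat/completions`) follow OpenAI's published API; the base URL and helper names are illustrative, not anything Hollama currently exposes.

```typescript
// Request builders for the two OpenAI-format endpoints. An
// OpenAI-compatible server (e.g. Bodhi App) exposes the same
// paths under its own host instead of api.openai.com.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// GET /v1/models lists the models the server can serve.
function modelsRequest(baseUrl: string): { url: string; method: 'GET' } {
  return { url: `${baseUrl}/v1/models`, method: 'GET' };
}

// POST /v1/chat/completions generates a chat completion.
function chatRequest(
  baseUrl: string,
  model: string,
  messages: ChatMessage[]
): { url: string; method: 'POST'; body: string } {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    method: 'POST',
    body: JSON.stringify({ model, messages }),
  };
}
```

Because only the paths and JSON body differ from Ollama's native API, a client that builds requests this way can point at any compatible host.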

fmaclen commented 1 month ago

I'm not entirely sure I understand this request, or what the use case would be.

It sounds like you want Hollama to expose an API using the OpenAI schema so other services can interact with Ollama using Hollama as a proxy of sorts? Is that correct?

binarynoise commented 1 month ago

I understand that he rather wants to connect to OpenAI-API-compatible servers that do not use the ollama-API.

fmaclen commented 1 month ago

@binarynoise What are "OpenAI-API-compatible servers"?

Is the suggested feature for adding a way to set your own OpenAI API key and be able to choose to get a completion from Ollama or OpenAI's API?

binarynoise commented 1 month ago

What are "OpenAI-API-compatible servers"?

Bodhi App seems to be one:

It also exposes these LLM inference capabilities as OpenAI API compatible REST APIs.

@anagri would have to clarify the rest.

anagri commented 1 month ago

Thanks @binarynoise

@fmaclen - many of the apps that run LLMs locally prefer exposing APIs in the OpenAI API format. These include jan.ai, lm-studio, Bodhi App, etc.

So if Hollama starts supporting the OpenAI APIs, we can use the Hollama UI to connect to the APIs exposed by these apps. And if someone wants to use their OpenAI API key, they can use that as well.

tl;dr: OpenAI APIs are more of an industry standard than the proprietary Ollama APIs. Hollama can gain a wider audience and adoption by supporting them.

fmaclen commented 1 month ago

@binarynoise @anagri thanks for the clarification, I understand what you mean now.

fmaclen commented 1 month ago

To expand a bit: I'm not opposed to the idea of allowing multiple completion servers; however, this is not a trivial implementation and we have other priorities at the moment.

If someone wants to give this a shot I'd start by using ollama.ts as the base template to handle the logic of the other servers.
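For anyone picking this up, one possible shape is a small server abstraction that both the existing Ollama logic and a new OpenAI-format client could implement. Note that only `ollama.ts` is real here; the `CompletionServer` interface, `OpenAIServer` class, and `authHeaders` helper below are hypothetical names sketched for illustration, not Hollama's actual code.

```typescript
// Hypothetical abstraction over completion servers. The idea is that
// ollama.ts would implement the same interface for the native Ollama API.

interface CompletionServer {
  listModels(): Promise<string[]>;
  chat(model: string, prompt: string): Promise<string>;
}

// Builds request headers; OpenAI-format servers use a Bearer token,
// which is optional for local servers that don't require a key.
function authHeaders(apiKey?: string): Record<string, string> {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  if (apiKey) headers['Authorization'] = `Bearer ${apiKey}`;
  return headers;
}

class OpenAIServer implements CompletionServer {
  constructor(private baseUrl: string, private apiKey?: string) {}

  async listModels(): Promise<string[]> {
    const res = await fetch(`${this.baseUrl}/v1/models`, {
      headers: authHeaders(this.apiKey),
    });
    const data = await res.json();
    // The models endpoint returns { data: [{ id, ... }, ...] }
    return data.data.map((m: { id: string }) => m.id);
  }

  async chat(model: string, prompt: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/v1/chat/completions`, {
      method: 'POST',
      headers: authHeaders(this.apiKey),
      body: JSON.stringify({ model, messages: [{ role: 'user', content: prompt }] }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }
}
```

With this shape, the UI would hold a `CompletionServer` and not care whether the backend speaks the Ollama or the OpenAI format.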

There are also a few factors to consider for this implementation: