containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

ollama serve/proxy support #1787

Open maxandersen opened 4 days ago

maxandersen commented 4 days ago

Is your feature request related to a problem? Please describe

I see the latest nightly has pull and list available like Ollama - awesome. That lets me use `ollama list`/`ollama pull`.

Any chance to trigger the equivalent of `ollama serve`, and have `/api/generate` and/or `/api/embeddings` work through a proxy, so that neither users nor apps need to look up the randomly generated port number of the running service?
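For illustration, this is the kind of call that would work against such a stable proxy; the port and model name here are purely hypothetical, and the request shape mirrors Ollama's `/api/generate`:

```ts
// Hypothetical: assumes AI Lab exposes an Ollama-compatible proxy on a fixed port.
const STABLE_ENDPOINT = "http://localhost:10434"; // fixed port, no lookup needed

const res = await fetch(`${STABLE_ENDPOINT}/api/generate`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "some-local-model", // placeholder model name
    prompt: "Why is the sky blue?",
    stream: false, // single JSON response instead of a stream
  }),
});
console.log((await res.json()).response);
```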

Describe the solution you'd like

A way to run models from the API or the command line, with a stable host/port exposing an OpenAI-compatible (or similar) GenAI serving API that relays requests to the underlying started container.
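A minimal sketch of what such a relay could look like, assuming the extension can resolve the randomly assigned port of the running inference container (the ports and error text below are hypothetical, not actual AI Lab internals):

```ts
import http from "node:http";

// Hypothetical values: the extension would resolve the backend port from the
// container it started; the stable port would be a documented constant.
const BACKEND_PORT = 49213; // randomly assigned port of the model container
const STABLE_PORT = 10434;  // fixed, well-known port users connect to

// Relay every request (/api/generate, /api/embeddings, ...) to the backend,
// streaming bodies in both directions so chunked/SSE responses still work.
http
  .createServer((clientReq, clientRes) => {
    const proxyReq = http.request(
      {
        host: "localhost",
        port: BACKEND_PORT,
        path: clientReq.url,
        method: clientReq.method,
        headers: clientReq.headers,
      },
      (proxyRes) => {
        clientRes.writeHead(proxyRes.statusCode ?? 502, proxyRes.headers);
        proxyRes.pipe(clientRes); // stream the backend response to the client
      }
    );
    proxyReq.on("error", () => {
      clientRes.writeHead(502).end("no model container running");
    });
    clientReq.pipe(proxyReq); // stream the client request body to the backend
  })
  .listen(STABLE_PORT);
```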

Describe alternatives you've considered

No response

Additional context

No response

jeffmaury commented 4 days ago

Are you requesting a new API verb like https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion?
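For context, the linked verb is Ollama's `POST /api/chat`; against a stock Ollama on its default port 11434, a non-streaming call looks roughly like this (the model name is a placeholder):

```ts
// Shape of Ollama's /api/chat request, per the linked docs.
const reply = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3", // placeholder model name
    messages: [{ role: "user", content: "Hello!" }],
    stream: false,
  }),
}).then((r) => r.json());
console.log(reply.message.content);
```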