barseghyanartur opened 2 days ago
Many of the models Ollama supports seem to also be supported by Groq; see https://ai.pydantic.dev/api/models/groq/.
Can you explain why you need Ollama as well as Groq?
Ollama exists so you don't have to rely on external vendors and can keep everything local. So Ollama supports use cases that require more security and data locality.
Agree with @abtawfik. Ollama is also ideal for development: it keeps costs low, requires less mocking, and makes testing cheap.
I think this is needed for privacy/data-security reasons, not model availability. Many corporate environments completely block access to external AI/LLM services due to infosec concerns.
Yes, this is unusable for us without support for totally private & local usage via ollama.
My company cannot make any network calls to AI services to analyze code written in-house, for security reasons.
👍 @samuelcolvin, as others have stated, Ollama helps you run LLMs locally/privately. Please add support for it.
There's definite demand for this; here's the 2nd-highest-ranked post in r/localllama in the past 30 days.
It's exactly what this project is doing ATM
Ollama exposes an OpenAI-compatible API, so it works with the OpenAI SDK, fwiw.
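For anyone who wants to try it, here's a minimal sketch of that compatibility, assuming an Ollama server running locally on the default port (11434) with a model already pulled; `llama3.2` is just an example model name:

```python
from openai import OpenAI

# Point the official OpenAI Python SDK at Ollama's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's local server
    api_key="ollama",  # required by the SDK but ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3.2",  # example; use whatever model you've pulled
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```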
Okay, happy to support it, especially if we can reuse some of the OpenAI integration.
PR welcome
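A rough sketch of what that reuse could look like, given the OpenAI-compatible endpoint above. This assumes `OpenAIModel` accepts a pre-configured `AsyncOpenAI` client via an `openai_client` parameter (as the docs describe for custom clients); treat the exact kwargs as assumptions, not a finished design:

```python
from openai import AsyncOpenAI
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

# Reuse the existing OpenAI model class, but aim it at a local Ollama server.
client = AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
model = OpenAIModel("llama3.2", openai_client=client)  # model name is an example

agent = Agent(model)
result = agent.run_sync("Say hello from a local model.")
print(result.data)
```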
Please! ;)