pydantic / pydantic-ai

Agent Framework / shim to use Pydantic with LLMs
https://ai.pydantic.dev
MIT License
1.35k stars 61 forks

Support `ollama` #112

Open barseghyanartur opened 2 days ago

barseghyanartur commented 2 days ago

Please! ;)

samuelcolvin commented 1 day ago

Many of the models ollama supports seem to also be supported by Groq, see https://ai.pydantic.dev/api/models/groq/.

Can you explain why you need ollama as well as groq?

abtawfik commented 1 day ago

Ollama is so you don't have to rely on vendors and can keep things local. So Ollama supports use-cases that require more security and locality.

barseghyanartur commented 1 day ago

Agree with @abtawfik + ollama is ideal for development. Keeping the costs low, less mocking, cheap testing.

gusutabopb commented 1 day ago

I think this is needed due to privacy/data-security purposes, not model availability. Lots of corporate environments completely block access to external AI/LLM services due to infosec concerns.

Lambda-Logan commented 20 hours ago

Yes, this is unusable for us without support for totally private & local usage via ollama.

My company cannot allow any network calls to AI services for analysis of code written in-house, because of security.

curiousily commented 20 hours ago

👍 @samuelcolvin, as others have stated, Ollama helps you run LLMs locally/privately. Please add support for it.

Lambda-Logan commented 20 hours ago

There's definite demand for this; here is the 2nd-highest-ranked post in r/LocalLLaMA in the past 30 days.

It's exactly what this project is doing ATM

https://www.reddit.com/r/LocalLLaMA/s/FHbdjdo7J3


arcaputo3 commented 14 hours ago

Ollama supports the OpenAI SDK fwiw
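To illustrate the point: a minimal stdlib-only sketch of hitting Ollama's OpenAI-compatible endpoint, assuming a local server on the default port 11434 and a locally pulled model named `llama3` (both are assumptions about the reader's setup, not part of this thread):

```python
import json
import urllib.request

# Assumption: Ollama's OpenAI-compatible API lives at /v1/chat/completions
# on the default local port 11434.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Standard OpenAI-style chat payload; the model name is whatever is
# pulled locally (here, hypothetically, `llama3`).
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer ollama",  # Ollama ignores the API key
    },
)

# With a server running (`ollama serve`), this returns an OpenAI-shaped
# chat completion:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request and response shapes match OpenAI's, an existing OpenAI client (or integration) can typically be reused just by overriding the base URL.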

samuelcolvin commented 4 hours ago

Okay, happy to support it, especially if we can reuse some of the openai integration.

PR welcome