jeremychone / rust-genai

Rust multiprovider generative AI client (Ollama, OpenAI, Anthropic, Groq, Gemini, Cohere, ...)
Apache License 2.0

OpenAI compatible API support #5

Open Zane-XY opened 2 months ago

Zane-XY commented 2 months ago

I noticed that the URL in this crate is hardcoded, so it currently only supports the official API endpoint of each AI service provider.

Would there be a plan to make the endpoint configurable? For example, allowing users to specify Azure OpenAI endpoints through configuration would greatly enhance flexibility.

jeremychone commented 2 months ago

@Zane-XY yes, endpoints will be configurable per adapter kind. I need to find the right way to do it (e.g., host/port vs. path, ...)
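One way to sidestep the host/port vs. path question is to treat the whole base URL (scheme, host, port, and any path prefix) as a single configurable value, with the adapter appending only its API-specific suffix. A minimal sketch below; none of these types exist in genai, and all names and URLs are hypothetical:

```rust
// Hypothetical sketch of a configurable endpoint; not genai's actual API.
struct Endpoint {
    /// Full base URL, e.g. "https://my-azure-host.example.com/openai".
    base_url: String,
}

impl Endpoint {
    fn new(base_url: &str) -> Self {
        // Normalize: strip a trailing slash so joining is predictable.
        Endpoint {
            base_url: base_url.trim_end_matches('/').to_string(),
        }
    }

    /// Join an adapter-specific path such as "/v1/chat/completions".
    fn url_for(&self, path: &str) -> String {
        format!("{}/{}", self.base_url, path.trim_start_matches('/'))
    }
}

fn main() {
    let ep = Endpoint::new("https://llm.internal.example.com/openai/");
    let url = ep.url_for("/v1/chat/completions");
    println!("{url}");
    // prints https://llm.internal.example.com/openai/v1/chat/completions
}
```

Because the base URL carries any path prefix (as Azure-style deployments need), the adapter never has to reason about host and port separately.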

jeremychone commented 2 months ago

@Zane-XY btw, feel free to explain your particular use case. I will make sure it gets covered.

Zane-XY commented 2 months ago

In my use case, the service URL and models are different, but the service is OpenAI compatible. Really appreciate the fast response!

jeremychone commented 2 months ago

@Zane-XY thanks. Is this AWS Bedrock / Google Vertex AI, or a custom service somewhere? Also, is it an Ollama server? (Their OpenAI compatibility layer requires some custom behaviors.)

Zane-XY commented 2 months ago

It's an enterprise hosted AI service.

jeremychone commented 2 months ago

Ok, that will probably be a Custom Adapter then. I will get to it; genai should support this use case.
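For readers wondering what a "Custom Adapter" could mean here: a trait the client dispatches to per provider, where an OpenAI-compatible adapter only needs to override the base URL. This is a sketch under that assumption; the trait, struct, and URLs below are illustrative only and do not reflect genai's internal design:

```rust
// Hypothetical adapter abstraction; names do not reflect genai internals.
trait Adapter {
    /// Base service URL; overriding this is what enables
    /// enterprise-hosted, OpenAI-compatible endpoints.
    fn base_url(&self) -> String;

    /// Provider-specific path; OpenAI-compatible services share this default.
    fn chat_path(&self) -> &'static str {
        "/v1/chat/completions"
    }

    /// Full URL the client would call for a chat request.
    fn chat_url(&self) -> String {
        format!("{}{}", self.base_url().trim_end_matches('/'), self.chat_path())
    }
}

/// An OpenAI-compatible adapter pointed at a custom (e.g. enterprise) host.
struct CustomOpenAiCompat {
    base_url: String,
}

impl Adapter for CustomOpenAiCompat {
    fn base_url(&self) -> String {
        self.base_url.clone()
    }
}

fn main() {
    let adapter = CustomOpenAiCompat {
        base_url: "https://llm.corp.example.com".into(),
    };
    println!("{}", adapter.chat_url());
    // prints https://llm.corp.example.com/v1/chat/completions
}
```

The default methods keep the OpenAI-compatible wire behavior in one place, so a custom deployment differs only in where requests are sent.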