Open Zane-XY opened 4 months ago

I noticed that the URL in this crate is hardcoded, which means it currently only supports the official API endpoint of each AI service provider.

Is there a plan to make the endpoint configurable? For example, allowing users to specify Azure OpenAI endpoints through configuration would greatly enhance flexibility.
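For illustration (the resource and deployment names below are made up), Azure OpenAI is a good example of why a simple host override may not be enough: the path is deployment-specific, there is an `api-version` query parameter, and the key goes in an `api-key` header rather than a bearer token:

```rust
// Needs reqwest = { features = ["blocking", "json"] } and serde_json.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical resource/deployment names; the point is the URL shape:
    // https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version=...
    let url = "https://my-resource.openai.azure.com/openai/deployments/my-gpt4o/chat/completions?api-version=2024-02-01";

    let res = reqwest::blocking::Client::new()
        .post(url)
        // Azure uses an `api-key` header instead of `Authorization: Bearer ...`.
        .header("api-key", std::env::var("AZURE_OPENAI_API_KEY")?)
        .json(&json!({
            "messages": [{ "role": "user", "content": "Hello!" }]
        }))
        .send()?;

    println!("{}", res.text()?);
    Ok(())
}
```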
@Zane-XY yes, endpoints will be configurable per adapter kind. I need to find the right way to do it (e.g., host/port vs. path, ...).
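To make that tradeoff concrete, here is a rough sketch (hypothetical names, not the final API): a host/port override is enough for servers that keep the standard OpenAI paths, while a full base-URL override is needed when the path itself differs, as with Azure OpenAI:

```rust
/// Hypothetical sketch, not the actual genai API: the two override shapes.
pub enum EndpointOverride {
    /// Enough for servers that keep the standard OpenAI paths,
    /// e.g. an Ollama or LM Studio instance on another host/port.
    HostPort { host: String, port: u16 },
    /// Needed when the path itself differs, e.g. Azure OpenAI's
    /// `/openai/deployments/{deployment}/chat/completions?api-version=...`.
    BaseUrl(String),
}

impl EndpointOverride {
    /// Resolve the final chat-completions URL for this override.
    pub fn chat_url(&self) -> String {
        match self {
            EndpointOverride::HostPort { host, port } => {
                format!("http://{host}:{port}/v1/chat/completions")
            }
            EndpointOverride::BaseUrl(base) => base.clone(),
        }
    }
}
```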
@Zane-XY btw, feel free to explain your particular use case. I will make sure it gets covered.
In my use case, the service URL and models are different, but the service is OpenAI-compatible. Really appreciate the fast response!
@Zane-XY thanks. Is this AWS Bedrock / Google Vertex AI, or a custom service somewhere? Also, is it an Ollama server? (Their OpenAI compatibility layer requires some custom behaviors.)
It's an enterprise-hosted AI service.
Ok, that will probably be a Custom Adapter then. I will get to it; genai should support this use case.
👍 +1
I'm using Jan.ai, TabbyML, and LM Studio to run local models, each with a local API server exposing an OpenAI-compatible API. I would like to use this crate to make requests to them (also for embeddings) 🙂
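For reference, a minimal sketch of what these servers accept (LM Studio's default port shown below; adjust for Jan/TabbyML): a plain OpenAI-style request against a local base URL, which is what I'd like to route through genai instead:

```rust
// Needs reqwest = { features = ["blocking", "json"] } and serde_json.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // LM Studio's local server default; Jan and TabbyML expose the same
    // OpenAI-compatible API on their own ports.
    let base = "http://localhost:1234/v1";

    let res = reqwest::blocking::Client::new()
        .post(format!("{base}/chat/completions"))
        // Embeddings would go to {base}/embeddings the same way.
        .json(&json!({
            "model": "local-model", // whatever name the local server reports
            "messages": [{ "role": "user", "content": "Hello!" }]
        }))
        .send()?;

    println!("{}", res.text()?);
    Ok(())
}
```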
Hi! +1 for this feature.
Basically, it would be better to just make the API base URL a variable that can be set in the constructor. That's what I did to use Ollama (it was not genai, but another project).
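Roughly this pattern (hypothetical names, not genai's actual types): default to the official endpoint, and let a constructor override it:

```rust
/// Hypothetical client sketch: the base URL defaults to the official
/// endpoint but can be overridden at construction time.
pub struct OpenAiClient {
    base_url: String,
    api_key: String,
}

impl OpenAiClient {
    const DEFAULT_BASE_URL: &'static str = "https://api.openai.com/v1";

    /// Official endpoint.
    pub fn new(api_key: impl Into<String>) -> Self {
        Self::with_base_url(api_key, Self::DEFAULT_BASE_URL)
    }

    /// Any OpenAI-compatible endpoint (Ollama, LM Studio, enterprise proxy, ...).
    pub fn with_base_url(api_key: impl Into<String>, base_url: impl Into<String>) -> Self {
        Self { base_url: base_url.into(), api_key: api_key.into() }
    }

    /// Build the chat-completions URL from the configured base.
    pub fn chat_url(&self) -> String {
        format!("{}/chat/completions", self.base_url)
    }
}
```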