Zane-XY opened this issue 2 months ago
@Zane-XY yes, endpoints will be configurable per adapter kind. I need to find the right way to do it (e.g., host/port vs. path, ...)
@Zane-XY btw, feel free to explain your particular use case. I will make sure it gets covered.
In my use case, the service URL and models are different, but the service is OpenAI-compatible. Really appreciate the fast response!
@Zane-XY thanks. Is this AWS Bedrock / Google Vertex AI, or a custom service somewhere? Also, is it an Ollama server? (Their OpenAI compatibility layer requires some custom behaviors.)
It's an enterprise hosted AI service.
OK, that will probably be a custom adapter then. I will get to it; genai should support this use case.
I noticed that the URL in this crate is hardcoded, so it currently only supports the official API endpoint of each AI service provider.
Are there plans to make the endpoint configurable? For example, allowing users to specify Azure OpenAI endpoints through configuration would greatly improve flexibility.
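To illustrate the request: a minimal, self-contained Rust sketch of how an adapter could resolve a user-supplied base URL, falling back to the provider's hardcoded default when no override is given. This is **not** the actual genai API — `EndpointConfig` and `resolve` are hypothetical names for illustration only.

```rust
/// Hypothetical config: an optional base-URL override per adapter.
struct EndpointConfig {
    /// User override, e.g. an Azure OpenAI or enterprise-hosted endpoint.
    base_url: Option<String>,
}

impl EndpointConfig {
    /// Use the override when present, otherwise the provider default.
    fn resolve(&self, default_url: &str) -> String {
        self.base_url
            .clone()
            .unwrap_or_else(|| default_url.to_string())
    }
}

fn main() {
    let default = "https://api.openai.com/v1/chat/completions";

    // No override: the hardcoded default is used.
    let cfg = EndpointConfig { base_url: None };
    println!("{}", cfg.resolve(default));

    // Override for an OpenAI-compatible enterprise service (made-up URL).
    let cfg = EndpointConfig {
        base_url: Some("https://ai.example.corp/v1/chat/completions".to_string()),
    };
    println!("{}", cfg.resolve(default));
}
```

A per-adapter-kind variant of this (as mentioned above) would just key such a config by adapter kind instead of holding a single override.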