Open syaikhipin opened 1 month ago
Hi @syaikhipin - this would indeed be helpful.
For now, I think a combination of litellm (https://github.com/BerriAI/litellm) and llm
(https://github.com/simonw/llm) should be an effective workaround. The latter is already installed as an OntoGPT dependency.
This is the basic idea:
1. Start a litellm proxy pointing at your OpenAI-compatible provider, so it exposes an OpenAI-style endpoint locally.
2. Add the proxied model to the extra-openai-models.yaml file that llm uses (see https://llm.datasette.io/en/stable/other-models.html). For OntoGPT, the model name should match the name of a model it already knows about (one of these: https://github.com/monarch-initiative/ontogpt/blob/main/src/ontogpt/models.yaml).
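For illustration, an entry in extra-openai-models.yaml might look like the sketch below, assuming a litellm proxy is running locally (the port and model IDs here are examples, not fixed values):

```yaml
# Illustrative entry for llm's extra-openai-models.yaml.
# model_id matches a model name OntoGPT already recognizes;
# api_base points at the local litellm proxy (the port is an assumption).
- model_id: gpt-3.5-turbo
  model_name: gpt-3.5-turbo
  api_base: "http://localhost:8000/v1"
```

With that in place, requests OntoGPT sends for gpt-3.5-turbo should be routed through the proxy to the alternative provider.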
I'm interested in using OntoGPT with an OpenAI-compatible base URL. Currently, OntoGPT seems to be designed for use with the official OpenAI API.
Could you please consider adding support for OpenAI-compatible base URLs? This would allow users to leverage alternative providers that offer similar functionality to OpenAI.
Benefits:
- Increased flexibility for users who have access to different API keys or prefer alternative providers.
- Potential cost savings if other providers offer more competitive pricing.

Possible Implementation:
If feasible, could OntoGPT accept a configuration option for specifying the base URL? This would enable users to choose between the official OpenAI API and compatible alternatives.
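One way such an option could be resolved internally (a minimal sketch; `resolve_base_url` and the `OPENAI_BASE_URL` variable name are illustrative, not existing OntoGPT API):

```python
import os

# Default endpoint of the official OpenAI API.
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_base_url(cli_value=None, env=os.environ):
    """Pick the API base URL with a simple precedence:
    an explicit CLI/config value wins, then an environment
    variable, then the official OpenAI default."""
    return cli_value or env.get("OPENAI_BASE_URL") or DEFAULT_BASE_URL
```

The resolved URL could then be passed straight to the OpenAI-compatible client, so users choose between the official API and alternatives without any other code changes.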