I suppose an environment variable that overrode the API base url would satisfy your use case?
If the environment variable allowed it to point to something like the LiteLLM proxy running locally, that would be awesome. It would also give you a stable way to test the changes. https://github.com/BerriAI/litellm E.g.:
$ pip install 'litellm[proxy]'
$ litellm --config myawesomesecretllm.yaml
#INFO: Proxy running on http://0.0.0.0:4000
$ export AICOMMIT_URL=http://0.0.0.0:4000
...
# all API keys are in litellm's config
...
$ aicommit
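(For reference, a minimal sketch of what myawesomesecretllm.yaml might contain; the model name and the os.environ key reference are assumptions based on the LiteLLM proxy docs, not something from this thread:)
$ cat > myawesomesecretllm.yaml <<'EOF'
# LiteLLM proxy config: map a public model name to a backing provider.
model_list:
  - model_name: gpt-4o                      # name clients request
    litellm_params:
      model: openai/gpt-4o                  # provider/model LiteLLM routes to
      api_key: os.environ/OPENAI_API_KEY    # key stays in the proxy's environment
EOF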
> I suppose an environment variable that overrode the API base url would satisfy your use case?
Sure, but that also means it needs an option to not use an API key (i.e., be OK with me leaving it blank).
For the future, the ability to hook up different API endpoints would be nice. I'll look into that LLM proxy. Thanks, Josh!
I use @openrouterteam, and it would be really cool if we could pass the base URL as an env variable.
I will definitely implement this, just not sure when.
Thank you all for the feedback.
Done in https://github.com/coder/aicommit/commit/733a37b900c9cf8e13da261414bf8b2966b26335, with the caveats described in the diff.
Using OpenAI or any other external service is a big no-no for many companies, and there are Ollama, Jan, and many other tools for running a local LLM. Many of them are even compatible with the OpenAI API. I would appreciate it if you supported those.
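(For illustration, a hedged sketch of pointing the tool at Ollama's OpenAI-compatible endpoint; the AICOMMIT_URL variable name is carried over from the earlier example and may not match what the linked commit actually implements:)
$ ollama serve                                      # serves an OpenAI-compatible API at /v1 on port 11434
$ ollama pull llama3                                # fetch a local model
$ export AICOMMIT_URL=http://127.0.0.1:11434/v1     # assumed variable name; check the linked commit
$ export OPENAI_API_KEY=placeholder                 # local servers generally ignore the key, but clients often require one
$ aicommit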