coder / aicommit

become the world's laziest committer

Using a local API endpoint #8

Closed by seefood 1 month ago

seefood commented 1 month ago

Using OpenAI or any external service is a big no-no for many companies, and there are ollama, Jan, and many other tools for running a local LLM. Many of them are even compatible with the OpenAI API. I would appreciate it if you supported those.

ammario commented 1 month ago

I suppose an environment variable that overrides the API base URL would satisfy your use case?
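(For illustration: a minimal sketch of how such an override could be wired up in Go with the sashabaranov/go-openai client. AICOMMIT_URL and newClient are placeholders from this discussion, not anything aicommit ships today.)

package main

import (
	"os"

	openai "github.com/sashabaranov/go-openai"
)

// newClient builds an OpenAI client whose base URL can be overridden
// from the environment, so requests go to a local server (ollama, Jan,
// a litellm proxy) instead of api.openai.com. AICOMMIT_URL is a
// hypothetical variable name.
func newClient() *openai.Client {
	cfg := openai.DefaultConfig(os.Getenv("OPENAI_API_KEY"))
	if base := os.Getenv("AICOMMIT_URL"); base != "" {
		cfg.BaseURL = base // e.g. http://localhost:4000/v1
	}
	return openai.NewClientWithConfig(cfg)
}

func main() { _ = newClient() }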

jsamuel1 commented 1 month ago

If the environment variable allowed pointing it at a product like the litellm proxy (https://github.com/BerriAI/litellm) running locally, that would be awesome. It would also give you a stable way to test changes. For example:

$ pip install 'litellm[proxy]'
$ litellm --config myawesomesecretllm.yaml
# INFO: Proxy running on http://0.0.0.0:4000
$ export AICOMMIT_URL=http://0.0.0.0:4000
...
# all API keys are in litellm's config
...
$ aicommit
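(For reference, a minimal sketch of what a litellm proxy config like myawesomesecretllm.yaml can contain; the model names here are placeholders:)

# myawesomesecretllm.yaml -- API keys and routing live here,
# so nothing secret has to reach aicommit's environment
model_list:
  - model_name: gpt-4o              # the name the proxy exposes
    litellm_params:
      model: ollama/llama3          # route it to a local ollama model
      api_base: http://localhost:11434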
seefood commented 1 month ago

> I suppose an environment variable that overrides the API base URL would satisfy your use case?

Sure, but that also means it needs an option to not use an API key (i.e., be OK with me leaving it blank).
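(One way to honor that, as a sketch rather than the project's actual check: only insist on a key when no custom endpoint is configured, since local servers often ignore the Authorization header entirely. AICOMMIT_URL is again the hypothetical variable from above.)

package main

import (
	"errors"
	"os"
)

// validateAuth requires OPENAI_API_KEY only for the default endpoint;
// with a custom base URL set, a blank key is tolerated.
func validateAuth() error {
	if os.Getenv("OPENAI_API_KEY") == "" && os.Getenv("AICOMMIT_URL") == "" {
		return errors.New("OPENAI_API_KEY is required when using api.openai.com")
	}
	return nil
}

func main() { _ = validateAuth() }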

For the future, the ability to hook up different API endpoints would be nice; I'll look into that LLM proxy. Thanks, Josh!

poyhen commented 1 month ago

I use @openrouterteam, and it would be really cool if we could pass the base URL as an env variable.
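(Assuming the hypothetical AICOMMIT_URL variable floated above, that could look like:

$ export AICOMMIT_URL=https://openrouter.ai/api/v1
$ export OPENAI_API_KEY=$OPENROUTER_API_KEY
$ aicommit

where https://openrouter.ai/api/v1 is OpenRouter's OpenAI-compatible endpoint.)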

ammario commented 1 month ago

I will definitely implement this, just not sure when.

Thank you all for the feedback.

ammario commented 1 month ago

Done in https://github.com/coder/aicommit/commit/733a37b900c9cf8e13da261414bf8b2966b26335, with caveats described in the diff.