Closed hanselke closed 12 months ago
Hey @hanselke, thanks for the pull request.
I was thrown at first because it was labeled "improve commit prompt", but I see you actually submitted a few other goodies as well. I've renamed the pull request appropriately.
As for the support for other language models - neat! I'm going to implement this more fully in a cleaner way that supports multiple different LLMs and a way to configure them. I'm also planning to have a separate requirements file for using local LLMs, as this installs quite a bit of extra stuff that I don't want installed for users who just want to use ChatGPT. This was a huge step forward in getting it working though, so thanks for the inspiration.
As for the Docker file - I like where you're going with this. Before accepting the pull request, I'll want a better solution than the "ugly hack" for the LLM. Any ideas there?
And lastly, the prompt improvements were spot on. I went ahead and cherry-picked that commit, and that change will go live in the next release.
Ah shit, sorry, my bad. I should have branched off to submit the PR. I didn't intend to submit the ugly hack + Dockerfile here.
With regards to running locally, my suggestions are in https://github.com/gorillamania/AICodeBot/issues/36
I suggest using NATS as a queue mechanism and keeping the LLM service logically separated from the main code base. I want to be able to run aicodebot on my Mac without a GPU, and set up the LLM service on my Linux box.

I'd be happy to set that up if you like that approach.
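To make the idea concrete, here's a minimal sketch of what that split could look like with the `nats-py` client. The subject name, message schema, and `generate` callback are all hypothetical, not anything that exists in AICodeBot today; the GPU box runs `serve()` next to the model, and the laptop calls `ask()` over the queue:

```python
import asyncio
import json

SUBJECT = "aicodebot.llm.generate"  # hypothetical NATS subject name


def encode_request(prompt: str, max_tokens: int = 256) -> bytes:
    """Serialize a generation request for the queue."""
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode()


def decode_request(payload: bytes) -> dict:
    """Deserialize a generation request received from the queue."""
    return json.loads(payload.decode())


async def serve(generate):
    """Run on the GPU box: answer requests using `generate(prompt) -> str`."""
    import nats  # lazy import: pip install nats-py (only needed on the service side)

    nc = await nats.connect("nats://localhost:4222")

    async def handler(msg):
        req = decode_request(msg.data)
        text = generate(req["prompt"])
        await msg.respond(json.dumps({"text": text}).encode())

    await nc.subscribe(SUBJECT, cb=handler)
    await asyncio.Event().wait()  # serve forever


async def ask(prompt: str) -> str:
    """Run on the laptop (no GPU): send a prompt, wait for the completion."""
    import nats

    nc = await nats.connect("nats://localhost:4222")
    msg = await nc.request(SUBJECT, encode_request(prompt), timeout=30)
    await nc.close()
    return json.loads(msg.data.decode())["text"]
```

Request/reply here keeps the main code base synchronous from the caller's point of view, while the actual model process can live anywhere a NATS server can reach.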
Fixes https://github.com/gorillamania/AICodeBot/issues/57.
This sort of fixes it; however, it adds a "NOTE: this commit msg was generated from diffs."
This seems to fix the NOTE.