Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Feature Request: Make aider expose OpenAI Compatible API #2226

Open distributev opened 3 weeks ago

distributev commented 3 weeks ago

Issue

Aider is one of the best (if not the best) coding tools around. But beside aider there are lots of other VS Code, IntelliJ IDEA, etc. plugins which try to do the same and are not as good. People have also tried to build a VS Code extension wrapper around aider. If aider exposed an OpenAI-compatible API, then all these tools could easily integrate aider as their AI coding engine.

This would also make it easier to integrate aider into the existing tools people already use.

For instance, https://github.com/cpacker/MemGPT offers its services through an OpenAI-compatible API, and because of that I can easily integrate MemGPT into my Emacs workflow even though MemGPT never developed custom code to allow this. At the same time I can expose MemGPT as a Slack chat bot even though the MemGPT people did not code for that.

Maybe you could build a LiteLLM aider backend to achieve this, since you already use LiteLLM and are probably familiar with it.

Version and model info

No response

agokrani commented 3 weeks ago

@distributev I think they are not interested in supporting this officially. However, I am thinking of doing this refactor personally. Would you be interested in giving more details on this? I already have a good picture of the entire codebase and I think it should be possible; I just want to get the fine-grained details.

If you are interested, please reach out over LinkedIn (Aman Gokrani).

distributev commented 3 weeks ago

One approach could be to expose aider through LiteLLM, which is a proxy that takes different LLM APIs and exposes them all in the standard OpenAI API format.

You would need to add a new folder called "aider" here:

https://github.com/BerriAI/litellm/tree/main/litellm/llms

and then, looking at the existing LiteLLM "adapters" (the other folders in that directory), write the custom code to "adapt" the existing aider API so it is exposed in the OpenAI API format.

This approach would work, but it would make the solution dependent on LiteLLM.

Another approach could be to write a Flask/FastAPI server that wraps the existing aider API behind the OpenAI format specification. This approach would not require LiteLLM.

The LiteLLM approach might be easier, since you can write the custom aider code guided by the existing LiteLLM adapters and just do the same thing for aider.
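To make the wrapper idea concrete, here is a minimal sketch of the translation layer such a server would need, independent of Flask/FastAPI plumbing. The `run_aider` function is a hypothetical stand-in for however the wrapper would actually invoke aider; the rest just maps an OpenAI-style `chat.completions` request body to the standard response envelope:

```python
import time
import uuid

def run_aider(prompt: str) -> str:
    # Hypothetical stand-in: a real wrapper would call into aider's
    # chat loop here and return its reply as a string.
    return f"aider reply to: {prompt}"

def chat_completions(request: dict) -> dict:
    """Translate an OpenAI-style request into an aider call and wrap
    the reply in the standard chat.completion response envelope."""
    # Treat the last user message as the prompt for aider.
    user_messages = [m for m in request["messages"] if m["role"] == "user"]
    prompt = user_messages[-1]["content"]
    reply = run_aider(prompt)
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": request.get("model", "aider"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
    }
```

A Flask/FastAPI route handler would then just call `chat_completions` on the parsed JSON body and return the result.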

agokrani commented 3 weeks ago

I don't think the LiteLLM approach will work, since most of those adapters are just a single API call to the model. What I am thinking is to separate out the entire core of aider and then handle the IO via callback handlers. This would allow usage not only via APIs but also via a Python SDK and the terminal.
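A rough sketch of that callback-handler idea (all names here are hypothetical, not aider's actual classes): the core never prints or prompts directly, it only calls the IO callbacks it was given, so the same core can be wired to a terminal, a Python SDK, or an API server:

```python
from typing import Callable

class AiderCore:
    """Hypothetical core: all IO goes through injected callbacks."""

    def __init__(self, on_output: Callable[[str], None],
                 on_confirm: Callable[[str], bool]):
        self.on_output = on_output
        self.on_confirm = on_confirm

    def run(self, message: str) -> None:
        # Placeholder for the real chat/edit loop.
        self.on_output(f"thinking about: {message}")
        if self.on_confirm("apply the proposed edit?"):
            self.on_output("edit applied")
        else:
            self.on_output("edit skipped")

def terminal_session(message: str) -> None:
    # Terminal front end: wire the callbacks to stdin/stdout.
    core = AiderCore(on_output=print,
                     on_confirm=lambda q: input(q + " [y/n] ") == "y")
    core.run(message)

def api_session(message: str) -> list:
    # API front end: collect output in a buffer and auto-confirm,
    # which is roughly what an OpenAI-compatible endpoint would do.
    transcript = []
    core = AiderCore(on_output=transcript.append,
                     on_confirm=lambda q: True)
    core.run(message)
    return transcript
```

The design point is that the interactive prompts (confirmations, file additions) become explicit callback decisions, so a non-terminal front end can answer them programmatically.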

paul-gauthier commented 2 weeks ago

Thanks for trying aider and filing this issue.

This isn't really possible as aider conversations don't follow the flow of an OpenAI chat.

distributev commented 2 weeks ago

For sure you know better what is possible and what is not.

I see aider + ChatCompletions API support being adopted as a "default" backend engine by most of the VS Code / IntelliJ IDEA code assistants, which, on top of aider, would only need to focus on adding high-level features and good integration with the IDE.

The ChatCompletions interface is not much, and I do not see many reasons why aider couldn't be wrapped with it. I chat with aider like I do with any other LLM, and even the /commands could be passed through the ChatCompletions API.
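The "/commands through the API" part could be as simple as a dispatcher in the wrapper that inspects each incoming chat message, a sketch (the routing scheme here is an assumption, not anything aider implements):

```python
def route_message(content: str):
    """Hypothetical router: an OpenAI-compatible wrapper could
    recognize aider's slash commands inside ordinary chat messages
    and send them to a command handler instead of the LLM chat."""
    if content.startswith("/"):
        # e.g. "/add main.py" -> command "/add", argument "main.py"
        command, _, argument = content.partition(" ")
        return ("command", command, argument)
    # Plain text goes to the normal chat flow.
    return ("chat", content, "")
```

So a client could send `/add main.py` as a user message and the wrapper would dispatch it to aider's command handling rather than treating it as a prompt.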

Recently GitHub Copilot added support for Claude and other models and became much better overall, greatly reducing the gap. For a fixed $10/month I get Claude through Copilot. With aider I had periods in which I was refilling my Claude credits with $20-25 every 2 or 3 days.

Another area in which Copilot is lacking (and most likely they are not willing to address it) is that they do not expose an API/CLI to automate Copilot; they want Copilot to be used exclusively through its UI. Aider has a CLI and a way to be scripted, but last time I tried it had some limitations.

distributev commented 2 weeks ago

For instance, just by exposing the current aider through a ChatCompletions API it would be possible, without you doing any additional coding, to use aider inside Emacs or vim, and there are still lots of people who use Emacs or vim.

gptel is an LLM client for Emacs which can connect to any LLM that exposes that API.

https://github.com/karthink/gptel

There are many opportunities just from exposing a standard API.
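That is the whole appeal of the standard: any client only needs to POST a standard body to the endpoint. A sketch of the request such a client (gptel included) would issue, assuming a hypothetical local aider server; the function builds the request without sending it:

```python
import json
import urllib.request

def build_chat_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) the POST that any OpenAI-compatible
    client would issue against a server exposing the standard API.
    The base_url (e.g. a local aider server) is an assumption."""
    body = json.dumps({
        "model": "aider",
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer unused"},
        method="POST",
    )
```

Nothing in this request is aider-specific, which is exactly why existing clients would work unchanged.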

mattduck commented 1 week ago

hi! thanks for all your work on aider.

I just noticed this while browsing through aider issues and want to +1 -- I had a similar thought/use-case earlier this week, wishing I could use aider as a backend for Emacs' gptel library to combine the very nice UX of gptel with the project/code-awareness features of aider.

my rationale is that I find all the project-aware parts of aider useful, but it's a much more natural experience if I can interact with it from within my editor/IDE rather than being constrained to a separate shell.

One example of this: often I'm already looking at the file or snippet that I want to edit or ask a question about. The least-friction way to do this is to type my message directly in the file and then send it to the LLM -- and there are various ways to do this already in Emacs. With a separate shell I have to check my file path, go to the shell, /add the file, tell aider which part of the file I'm interested in, type my message, and then go back to the editor to check the edit results.

it doesn't sound like much, but there are lots of interactions like this that IMO add up to a much better experience when they're integrated with the editor/IDE. I ended up writing Emacs code to wrap aider to try to bridge the gap -- but it's brittle, and it would be nicer if I could use existing LLM frontend tools for this, at least for the chat parts of aider.

I appreciate that it's not as simple as a 1:1 flow though and this might not be the design direction for the project.