Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/

[Question] Get Local (Qwen) LLMs to better conform to Aider's expectations #2027

Open Mushoz opened 5 days ago

Mushoz commented 5 days ago

Issue

I was wondering whether and how people are using local LLMs together with Aider for their code editing needs. I have been trying the following models:

qwen2.5-coder:7b-instruct-fp16
qwen2.5:32b-instruct-Q4_K_M
qwen2.5:72b-instruct-Q4_K_M

All of them are running through Ollama, with a context length of 32k.
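
(For the 32k context I'm assuming the usual Ollama route of baking num_ctx into a custom Modelfile, roughly like this; the derived model name is my own choice:)

```
# Modelfile -- create with: ollama create qwen2.5-coder-32k -f Modelfile
FROM qwen2.5-coder:7b-instruct-fp16
PARAMETER num_ctx 32768
```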

None of them adheres to the instructions Aider sets forth. They mainly add needless explanations instead of outputting only the code changes. I have added just a single 125-line file to the context, cleared the chat history (/clear), and disabled the repo map to keep the load on the context as small as possible, yet the problem persists.
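
For reference, this is roughly how I'm launching Aider. The model tag is just one of the three above, your_file.py is a placeholder, and the flags are how I understand them from the docs, so double-check them against your version:

```bash
# Point aider at the local Ollama server (default port assumed).
export OLLAMA_API_BASE=http://127.0.0.1:11434

# Disable the repo map and force the "whole" edit format while testing.
aider --model ollama/qwen2.5-coder:7b-instruct-fp16 \
      --map-tokens 0 \
      --edit-format whole \
      your_file.py
```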

Strangely enough, I CAN get the LLM to cooperate by adding this to the end of my own prompts: "Do not add ANY explanation of the proposed code changes. Only output the new code in the earlier mentioned format." But adding this instruction every single time is cumbersome and not how Aider is supposed to work.
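
One workaround I'm considering to avoid retyping it, assuming the --read flag works the way I think it does (the CONVENTIONS.md name is just my choice), is to keep the reminder in a small file that gets loaded into every chat:

```bash
# Put the reminder in a conventions file (name is arbitrary)...
cat > CONVENTIONS.md <<'EOF'
Do not add ANY explanation of the proposed code changes.
Only output the new code in the requested edit format.
EOF

# ...and add it read-only to every aider session.
aider --read CONVENTIONS.md --model ollama/qwen2.5:72b-instruct-Q4_K_M
```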

I know local models are not as strong as models such as Claude 3.5 Sonnet and GPT-4o. But especially for the 72b model, I was expecting it to at least be able to follow the instructions. I am experiencing issues with both the "diff" and "whole" edit formats.
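
For context, my understanding is that the "diff" format expects the model to reply with search/replace blocks along these lines (the file name and code here are just an illustration, and the exact markers may differ between Aider versions):

```
your_file.py
<<<<<<< SEARCH
def greeting(name):
    print("hello", name)
=======
def greeting(name):
    print("hello there,", name)
>>>>>>> REPLACE
```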

Curious to see if people have better experiences, and if so, how they managed to get it to work properly.

Version and model info

No response

WilliamStone commented 4 days ago

> Strangely enough, I CAN get the LLM to cooperate by adding this to the end of my own prompts: "Do not add ANY explanation of the proposed code changes. Only output the new code in the earlier mentioned format." But adding this instruction every single time is cumbersome and not how Aider is supposed to work.

I face the same problem. I found that the prompt content is in wholefile_prompts.py, and you can simply modify it as you see fit; there are prompt files for the other edit formats in the same directory as well. Hint: you can edit <path/to/your/python>/Lib/site-packages/aider/coders/wholefile_prompts.py and rerun Aider to pick up your modification immediately.
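
If you'd rather not edit site-packages directly, a quick alternative is a small wrapper script that patches the prompt at startup. This is only a sketch: the strict_aider.py name is mine, and the WholeFilePrompts.main_system attribute and aider.main entry point are assumed from the copy installed here, so check them against your version.

```python
# strict_aider.py -- hypothetical wrapper, not part of aider itself.
# Appends a stricter reminder to the "whole" edit-format system prompt,
# then starts aider normally with whatever arguments were passed in.
import sys

from aider.coders.wholefile_prompts import WholeFilePrompts
from aider.main import main as aider_main

WholeFilePrompts.main_system += (
    "\nDo not add ANY explanation of the proposed code changes."
    "\nOnly output the new code in the requested format.\n"
)

if __name__ == "__main__":
    aider_main(sys.argv[1:])
```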