lee88688 / aider-composer

Aider's VSCode extension, seamlessly integrated into VSCode
https://marketplace.visualstudio.com/items?itemName=lee2py.aider-composer
Apache License 2.0

Looking for Recommendations on how to Debug #23

Open jjafuller opened 4 days ago

jjafuller commented 4 days ago

I am wondering where I should start when trying to debug aider-composer. I am attempting to use Ollama. From what I can gather, the extension sends messages to the facade server, but no call appears to actually be made to the Ollama backend. I can see some log entries in the extension output, but they are pretty high level.

2024-11-22 14:20:42.713 [info] Extension "aider-composer" is now active!
2024-11-22 14:20:42.713 [info] Starting aider-chat service...
2024-11-22 14:20:42.713 [info] aider-chat process args: C:\Projects\third-party\aider\.venv\Scripts\python.exe -m flask -A c:\Users\jjafc\.vscode\extensions\lee2py.aider-composer-1.3.0\server\main.py run --port 13491
2024-11-22 14:20:46.201 [info] aider-chat: WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
2024-11-22 14:20:46.201 [info] aider-chat:  * Running on http://127.0.0.1:13491
2024-11-22 14:20:46.202 [info] server started: http://127.0.0.1:13491
2024-11-22 14:20:46.202 [info] aider-chat: Press CTRL+C to quit
2024-11-22 14:20:48.842 [info] aider-chat: 127.0.0.1 - - [22/Nov/2024 14:20:48] "OPTIONS /api/chat/setting HTTP/1.1" 200 -
2024-11-22 14:20:48.845 [info] aider-chat: 127.0.0.1 - - [22/Nov/2024 14:20:48] "POST /api/chat/setting HTTP/1.1" 200 -
2024-11-22 14:20:52.294 [info] From Webview: sendChatMessage: hi
2024-11-22 14:20:52.313 [info] aider-chat: 127.0.0.1 - - [22/Nov/2024 14:20:52] "OPTIONS /api/chat HTTP/1.1" 200 -
2024-11-22 14:20:52.406 [info] aider-chat: 127.0.0.1 - - [22/Nov/2024 14:20:52] "POST /api/chat HTTP/1.1" 200 -
2024-11-22 14:21:24.185 [info] aider-chat: 127.0.0.1 - - [22/Nov/2024 14:21:24] "POST /api/chat HTTP/1.1" 200 -
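
As a sanity check on connectivity (not an extension feature, just a direct query against Ollama's /api/tags endpoint), something like this from the same machine should list the installed models:

curl http://192.168.5.15:11434/api/tags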

This is the configuration I am using:

[Screenshot: aider-composer model settings in VS Code]

I tried hitting the facade server locally to see if it returned any additional information.

curl --request POST \
  --url http://127.0.0.1:13491/api/chat \
  --header 'content-type: application/json' \
  --data '{
  "chat_type": "ask",
  "diff_format": "diff",
  "message": "hi",
  "reference_list": []
}'

This is the response, which doesn't give me much to go on: "Tokens: 68 sent, 0 received" suggests the prompt is being assembled, but nothing comes back from the model.

event: usage
data: "Tokens: 68 sent, 0 received."

event: end
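
To rule out Ollama itself, one option is to hit its OpenAI-compatible chat endpoint directly (this assumes a reasonably recent Ollama build that exposes /v1/chat/completions; the model name is the one from my terminal setup below):

curl --request POST \
  --url http://192.168.5.15:11434/v1/chat/completions \
  --header 'content-type: application/json' \
  --data '{
  "model": "qwen2.5-coder:32b-instruct-q4_K_M",
  "messages": [{"role": "user", "content": "hi"}]
}'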

The following is the PowerShell script that I use to run aider in the terminal, and it works as expected.

$env:OLLAMA_API_BASE = "http://192.168.5.15:11434" 
C:\Projects\third-party\aider\.venv\Scripts\aider.exe --model ollama/qwen2.5-coder:32b-instruct-q4_K_M
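
One thing I may try next (not something the extension exposes) is launching the bundled server by hand with the same arguments the extension logs above, but with Flask debug mode enabled via the FLASK_DEBUG environment variable, and then replaying the /api/chat/setting and /api/chat requests against it to get a full traceback if the call to Ollama fails. Roughly:

$env:FLASK_DEBUG = "1"
$env:OLLAMA_API_BASE = "http://192.168.5.15:11434"
C:\Projects\third-party\aider\.venv\Scripts\python.exe -m flask -A c:\Users\jjafc\.vscode\extensions\lee2py.aider-composer-1.3.0\server\main.py run --port 13491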

Is there a way to enable debug logging, or are there logs elsewhere I can look at?

lee88688 commented 4 days ago

You can try the latest version; it solves this problem.

jmaerki commented 3 days ago

I had the same problem as jjafuller, and it was still the same at first after upgrading to 1.3.2, but then I changed the LLM config again to a silly little "qwen2.5-coder:3b" and it started calling Ollama. However, it is now stuck in an infinite loop, producing the same unusable result each time, and there is no button to cancel the request. ;-)

lee88688 commented 3 days ago

> I had the same problem as jjafuller, and it was still the same at first after upgrading to 1.3.2, but then I changed the LLM config again to a silly little "qwen2.5-coder:3b" and it started calling Ollama. However, it is now stuck in an infinite loop, producing the same unusable result each time, and there is no button to cancel the request. ;-)

In aider you can use Ctrl+C to cancel, but in VS Code there doesn't seem to be an API to do this. I'll open an issue with aider to get help with it.