ex3ndr / llama-coder

Replace Copilot local AI
https://marketplace.visualstudio.com/items?itemName=ex3ndr.llama-coder
MIT License

Extension Literally Does Nothing #42

Closed PaulSinghDev closed 7 months ago

PaulSinghDev commented 7 months ago

I have spent some time trying to find a usage guide but failed miserably. The extension is clearly installed and doing something, as I can see the spinning wheel in the footer of the VSCode window.

I am also sure I have the model on my machine, as I can see it in Ollama and can run an instance via ollama run <MODEL_NAME>.
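For reference, here is a minimal Python sketch for confirming the model is reachable over Ollama's HTTP API, which is what the extension talks to rather than the ollama run REPL. The endpoint is the Ollama default and the model name is taken from the log below; both are assumptions to adjust for your setup.

```python
# Minimal sketch: confirm the model is reachable over Ollama's HTTP API
# (the interface the extension uses), not just via `ollama run`.
# Assumes the default endpoint http://localhost:11434 and the model name
# seen in the logs below; adjust both as needed.
import json
import urllib.request

OLLAMA = "http://localhost:11434"
MODEL = "stable-code:3b-code-q4_0"

# List locally installed models.
with urllib.request.urlopen(f"{OLLAMA}/api/tags") as resp:
    tags = json.load(resp)
print([m["name"] for m in tags.get("models", [])])

# Stream a tiny completion and print the response text as it arrives.
req = urllib.request.Request(
    f"{OLLAMA}/api/generate",
    data=json.dumps({"model": MODEL, "prompt": "def add(a, b):", "stream": True}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for line in resp:
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):
            print()
            break
```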

When I go to the Output tab of VSCode I can see the following:

2024-02-26 14:21:32.734 [info] Llama Coder is activated.
2024-02-26 14:26:51.699 [info] Canceled after AI completion.
2024-02-26 14:26:52.006 [info] Running AI completion...
2024-02-26 14:27:00.153 [info] Receive line: {"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:27:00.151644Z","response":".","done":false}
2024-02-26 14:27:00.158 [info] AI completion completed: 
2024-02-26 14:27:00.158 [info] Canceled after AI completion.
2024-02-26 14:27:00.160 [info] Canceled before AI completion.
2024-02-26 14:28:34.455 [info] Running AI completion...
2024-02-26 14:28:47.386 [info] Receive line: {"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:28:47.384057Z","response":"","done":true,"total_duration":12925537752,"load_duration":5852886428,"prompt_eval_count":130,"prompt_eval_duration":7072116000,"eval_count":1,"eval_duration":22000}
2024-02-26 14:28:47.388 [info] AI completion completed: 
2024-02-26 14:28:47.388 [info] Canceled after AI completion.
2024-02-26 14:28:47.392 [info] Canceled before AI completion.
2024-02-26 14:28:47.392 [info] Canceled before AI completion.
2024-02-26 14:28:47.394 [info] Canceled before AI completion.
2024-02-26 14:28:47.404 [info] Running AI completion...
2024-02-26 14:28:47.726 [info] Receive line: {"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:28:47.725568Z","response":"\n\n","done":false}
2024-02-26 14:28:47.829 [info] Receive line: {"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:28:47.828662Z","response":"","done":true,"total_duration":422186670,"load_duration":762292,"prompt_eval_count":4,"prompt_eval_duration":317920000,"eval_count":2,"eval_duration":102965000}
2024-02-26 14:28:47.830 [info] AI completion completed: 

2024-02-26 14:28:50.926 [info] Running AI completion...
2024-02-26 14:28:51.185 [info] Receive line: {"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:28:51.184114Z","response":"","done":true,"total_duration":257221422,"load_duration":439644,"prompt_eval_count":3,"prompt_eval_duration":256262000,"eval_count":1,"eval_duration":24000}
2024-02-26 14:28:51.186 [info] AI completion completed: 

However, nothing is ever output in the terminal or within the filesystem.
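The duration fields in those log lines are nanoseconds (the 12925537752 total_duration matches the ~12.9 s gap between the 14:28:34 and 14:28:47 timestamps). A small helper sketch to convert them makes the problem visible: roughly 5.9 s of model loading plus 7.1 s of prompt evaluation for 130 tokens before a single completion token is produced.

```python
# Helper sketch: Ollama's *_duration fields are reported in nanoseconds.
# Converting the second "done":true line above shows where the time goes.
import json

line = (
    '{"model":"stable-code:3b-code-q4_0","created_at":"2024-02-26T14:28:47.384057Z",'
    '"response":"","done":true,"total_duration":12925537752,"load_duration":5852886428,'
    '"prompt_eval_count":130,"prompt_eval_duration":7072116000,"eval_count":1,"eval_duration":22000}'
)

stats = json.loads(line)
ns = 1e9
print(f"total:       {stats['total_duration'] / ns:6.2f} s")
print(f"model load:  {stats['load_duration'] / ns:6.2f} s")
print(f"prompt eval: {stats['prompt_eval_duration'] / ns:6.2f} s "
      f"({stats['prompt_eval_count'] / (stats['prompt_eval_duration'] / ns):.1f} tok/s)")
print(f"generated:   {stats['eval_count']} token(s)")
```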

PaulSinghDev commented 7 months ago

OK, I have figured it out: it is just very slow on the work machine. Dropping all the token limits down to small numbers did make it generate something, so this is more a performance issue than anything else.
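To reproduce that workaround outside the editor, one can cap generation length with Ollama's num_predict option (the standard Ollama generation option for limiting output; presumably what the extension's token settings control) and time the request. Endpoint, model name, and prompt are assumptions for illustration.

```python
# Sketch: time a single completion with a capped number of generated
# tokens to check whether raw Ollama speed is the bottleneck.
import json
import time
import urllib.request

OLLAMA = "http://localhost:11434"
MODEL = "stable-code:3b-code-q4_0"

def timed_completion(num_predict: int) -> float:
    """Run one non-streaming completion and return the wall-clock time in seconds."""
    req = urllib.request.Request(
        f"{OLLAMA}/api/generate",
        data=json.dumps({
            "model": MODEL,
            "prompt": "// binary search in TypeScript\n",
            "stream": False,
            "options": {"num_predict": num_predict},
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        json.load(resp)
    return time.monotonic() - start

for n in (16, 64, 256):
    print(f"num_predict={n:>3}: {timed_completion(n):.1f} s")
```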