ex3ndr / llama-coder

Replace Copilot local AI
https://marketplace.visualstudio.com/items?itemName=ex3ndr.llama-coder
MIT License

Can't use custom model #44

Closed HelmholtzW closed 6 months ago

HelmholtzW commented 7 months ago

I want to use a custom model via Ollama, which I named copilot. I ran `ollama create copilot -f copilot`, and `ollama run copilot` works in the terminal; other extensions also work well with it. However, when I start Llama Coder and set the custom model to copilot, the extension tells me that copilot wasn't downloaded and asks if I want to download it. If I click 'No', nothing happens. Looking at the logs, I see "Ingoring since the user asked to ignore download.".

How does Llama Coder check whether a model exists in Ollama?

How can I use this model, given that I already created it manually from the gguf file with `ollama create`? Thanks in advance!
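A plausible explanation (an assumption, not confirmed against the extension's source): Ollama reports locally available models under their fully tagged names, e.g. `copilot:latest` from its `/api/tags` endpoint, so an exact string comparison against the bare name `copilot` finds no match. A minimal Python sketch of that kind of check, using a hypothetical tag list in place of a live Ollama response:

```python
# Sketch of how an extension might verify that a model exists in Ollama.
# The tag list below is hypothetical; a real check would GET
# http://localhost:11434/api/tags and read the names in the "models" array.

def model_exists(requested: str, available: list[str]) -> bool:
    """Exact-match lookup against Ollama's fully tagged model names."""
    return requested in available

# Hypothetical contents of /api/tags after `ollama create copilot -f copilot`
tags = ["codellama:7b-code-q4_K_M", "copilot:latest"]

print(model_exists("copilot", tags))         # no bare "copilot" entry -> not found
print(model_exists("copilot:latest", tags))  # full tagged name -> found
```

If the check works this way, supplying the fully tagged name in the extension's custom-model setting would make the lookup succeed, which matches the fix reported below.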

HelmholtzW commented 7 months ago

Sorry, I found the error: I had to specify `copilot:latest` as the custom model's name.

super-crayfish commented 6 months ago

close it