Open cadeff01 opened 5 months ago
My URL uses a custom CA, but I also have NODE_EXTRA_CA_CERTS=/etc/ssl/certs/ca-certificates.crt set, which, as I understand it, should resolve any SSL issues with a custom CA.
I ran some tests locally and confirmed that the failure is related to the custom CA. Any chance of getting support for custom CA certs?
I get the same warning in VSCode: https://github.com/ex3ndr/llama-coder/issues/3#issuecomment-1917842038 I'd appreciate any help with it.
VSCode, as the host, controls all connections that extensions open and use, so this isn't specific to Llama Coder.
Have you tried the solution from this Stack Overflow question?
I've tried the NODE_EXTRA_CA_CERTS solution, since disabling SSL verification is a really bad idea, but that didn't help. For similar plugins such as Continue, I know they had to add explicit support for extra certs in the plugin itself for this to work.
I have an Ollama container running the stable-code:3b-code-q4_0 model. I'm able to interact with the model via curl:
curl -d '{"model":"stable-code:3b-code-q4_0", "prompt": "c++"}' https://notarealurl.io/api/generate
and get a response in a terminal in WSL, where I'm running VSCode.
However, when I set the Ollama Server Endpoint to https://notarealurl.io/, I just get:
[warning] Error during inference: fetch failed
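That bare "fetch failed" message is characteristic of Node's built-in (undici-based) fetch, which hides the real TLS or DNS error inside `error.cause`. A small diagnostic sketch (the helper name is mine) that surfaces the underlying code, e.g. UNABLE_TO_GET_ISSUER_CERT_LOCALLY when a custom CA isn't trusted:

```javascript
// Requires Node >= 18 for the global fetch.
// Returns a human-readable summary instead of swallowing the cause.
async function explainFetchFailure(url) {
  try {
    const res = await fetch(url);
    return `ok: ${res.status}`;
  } catch (err) {
    // err.cause carries the underlying socket/TLS error (code, message).
    return `${err.message}: ${err.cause && err.cause.code}`;
  }
}
```

Running this against the failing endpoint from inside the same environment as the extension host should tell you whether the failure is really the custom CA or something else (DNS, proxy, etc.).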