Closed bitlgy closed 1 month ago
In the screenshot above, codellama (running locally on WSL with Ubuntu 18.04) has responded to CodeGPT's request, but CodeGPT shows the error "fail to fetch the chat response".
Hello! Please ensure that Ollama is running locally and that you have downloaded the model you are using.
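A quick way to verify both of those steps from the WSL terminal might look like this. This is just a sketch: it assumes Ollama's default local endpoint (`http://localhost:11434`) and uses `codellama` as the example model name.

```shell
# Check whether the Ollama server is reachable on its default port (11434).
if curl --silent --fail http://localhost:11434/api/tags > /dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable -- start it with: ollama serve"
fi

# Make sure the model the extension is configured to use is actually
# downloaded locally (codellama here is just an example):
# ollama pull codellama
# ollama list
```

If the server is reachable but the extension still fails, it's worth checking that the model name in the CodeGPT settings matches one of the names shown by `ollama list` exactly.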
@bitlgy I wrote up a procedure for WSL here: https://medium.com/p/881b91ba193e. Hope it works for you.