thienedits (issue closed 7 months ago):
On CodeGPT 3.2.6, macOS 13.4, and VS Code 1.88.1, CodeGPT doesn't output anything. Running Ollama from the terminal works fine.
Hi, I think you need to change the dropdown entries for Llama 3. I had to type in the correct model name, "llama3:instruct", instead of selecting "Llama3:8b" from the model dropdown.
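For anyone hitting the same mismatch, a quick way to confirm the exact model tag your local Ollama expects (assuming the ollama CLI is installed, as in the original report) is:

```sh
# List locally installed models; the NAME column shows the exact tag
# CodeGPT needs (e.g. "llama3:instruct", not the dropdown's "Llama3:8b").
ollama list

# If the instruct variant isn't installed yet, pull it first:
ollama pull llama3:instruct
```

Whatever appears under NAME is what has to go into CodeGPT's model field verbatim.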