machisuji opened 2 weeks ago
Hello, I think this was my mistake: I updated the default chat path to /api/chat
when it should be /v1/chat/completions.
I just fixed it in the most recent version. For FIM, the path /api/generate
should be working.
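To make the fix concrete, here is a minimal sketch of the two endpoints involved, assuming Ollama's default host `http://localhost:11434`. The payload shapes are illustrative (the model name and prompt strings are just examples from this report); chat goes through the OpenAI-compatible path, while FIM uses Ollama's native generate path.

```python
# Sketch of the two Ollama endpoints twinny should target.
# Assumes the default Ollama host; payloads are illustrative examples.
from urllib.parse import urljoin

OLLAMA_BASE = "http://localhost:11434"

# Chat: the OpenAI-compatible endpoint, not /api/chat.
CHAT_PATH = "/v1/chat/completions"
chat_payload = {
    "model": "deepseek-coder-v2:16b",
    "messages": [{"role": "user", "content": "Hello"}],
}

# FIM (fill-in-the-middle): Ollama's native generate endpoint.
FIM_PATH = "/api/generate"
fim_payload = {
    "model": "deepseek-coder-v2:16b",
    "prompt": "def add(a, b):",   # hypothetical code snippet
    "suffix": "    return total", # hypothetical code snippet
}

print(urljoin(OLLAMA_BASE, CHAT_PATH))
print(urljoin(OLLAMA_BASE, FIM_PATH))
```

Pointing the chat request at /api/chat instead of /v1/chat/completions is what produced the errors in the ollama logs below.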
Describe the bug Trying to use twinny with deepseek-coder-v2:16b via ollama, but neither chat nor FIM works, due to the following errors seen in the ollama logs:
To Reproduce
Expected behavior It should work
Screenshots If applicable, add screenshots to help explain your problem.
Logging Enable logging in the extension settings if not already enabled (you may need to restart vscode if you don't see logs). Provide the log with the report.
API Provider ollama
Chat or Auto Complete? both chat and fim
Model Name deepseek-coder-v2:16b