JudiniLabs / code-gpt-docs

It's not working with Ollama #239

Open vfraloo opened 6 months ago

vfraloo commented 6 months ago

I'm using this extension in VSCodium, but it doesn't seem to be working properly. I have set up the Provider and model, and I can see API logs in Ollama's server.log, but there is no response when I chat or use the auto-complete feature.

The logs look like this:

```
[GIN] 2024/03/08 - 10:14:33 | 404 | 505.1µs | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/03/08 - 10:15:15 | 404 | 504.4µs | 127.0.0.1 | POST "/api/generate"
```

(screenshot attached)
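For what it's worth, a 404 from these endpoints usually means Ollama can't find the requested model (it answers "model not found, try pulling it first" with that status), so the server can be checked by hand. A quick test, where `llama3` is just a placeholder for whatever model is actually installed:

```bash
# Ask the local Ollama server directly; an uninstalled model also returns 404 here.
curl http://127.0.0.1:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hi", "stream": false}'
```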

davila7 commented 6 months ago

Try updating the extension to version 3.2.3 and restarting Ollama.

thienedits commented 4 months ago

> Try updating the extension to version 3.2.3 and restarting Ollama.

I am confused why the dropdown selections for Ollama aren't even the correct repository names. @vfraloo I think you will need to type in the correct model tag, such as `deepseek-coder:6.7b-instruct`, instead of just picking one of the default entries in the dropdown menu. This is what was so confusing to me at first. What is the point of having wrongly named menu options?

I think the default "Deepseek-Coder" entry just requests the default download, so be sure you have actually downloaded that default model.
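As a sanity check, the exact tag can be pulled and then confirmed against the installed list from the command line (the tag below is the one mentioned above; substitute whichever model you use):

```bash
# Pull the exact tag the extension will request, then confirm it's installed.
ollama pull deepseek-coder:6.7b-instruct
ollama list
```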

pawlakus commented 4 months ago

See this answer for a simple Python asyncio proxy with model-name replacement that I use daily:

https://github.com/davila7/code-gpt-docs/issues/227#issuecomment-2142803287

Edit `MODEL_SWAP` to your liking, replacing `llama3:8b` with any fine-tune, like ChatQA-v1.5, or any other model.
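For readers who don't want to click through, here is a minimal sketch of the same idea, not the exact script from #227: a local HTTP proxy (built on aiohttp here rather than raw asyncio, for brevity) that rewrites the `model` field of POST bodies before forwarding them to Ollama. The `MODEL_SWAP` mapping, ports, and model names are illustrative only.

```python
# Minimal sketch (not the exact script from issue #227): an aiohttp-based
# proxy that rewrites the "model" field of POST bodies before forwarding
# them to Ollama. MODEL_SWAP, host, and ports below are illustrative.
import json

from aiohttp import ClientSession, web

OLLAMA_URL = "http://127.0.0.1:11434"  # where the real Ollama listens
LISTEN_PORT = 11435                    # point the extension at this port instead

# Map the model names the extension sends to models actually installed locally.
MODEL_SWAP = {
    "llama3:8b": "llama3-chatqa:8b",  # example mapping only, edit to taste
}

async def proxy(request: web.Request) -> web.StreamResponse:
    body = await request.read()
    if body:
        try:
            payload = json.loads(body)
            if payload.get("model") in MODEL_SWAP:
                payload["model"] = MODEL_SWAP[payload["model"]]
            body = json.dumps(payload).encode()
        except ValueError:
            pass  # body isn't JSON; forward it untouched

    async with ClientSession() as session:
        async with session.post(OLLAMA_URL + request.path, data=body) as upstream:
            # Relay Ollama's (possibly streamed) response chunk by chunk.
            response = web.StreamResponse(status=upstream.status)
            response.content_type = upstream.content_type
            await response.prepare(request)
            async for chunk in upstream.content.iter_any():
                await response.write(chunk)
            await response.write_eof()
            return response

app = web.Application()
app.router.add_post("/{tail:.*}", proxy)

if __name__ == "__main__":
    web.run_app(app, port=LISTEN_PORT)
```

If the extension lets you configure the Ollama URL, point it at the proxy (port 11435 in this sketch) and any request for `llama3:8b` will be served by the swapped-in model instead.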