daniel-a-diaz closed this issue 4 months ago
Ollama just added DeepSeek Coder https://github.com/jmorganca/ollama/issues/1040#event-11017917085
It will come in the next version, probably next week
DeepSeek is great! Please make it possible to select models up to 33b. Are you also going to add this model to the CodeGPT.Autocomplete: Provider setting?
DeepSeek-coder is now available in version 2.2.5 🙌
I tried using deepseek-coder:33b but can't get an answer out of it. I think something is wrong with the way prompts are being passed to the model: it keeps talking about "###" symbols, which I didn't include in my prompt.
It might be something more general with the 2.2.5 release, or specific to the Ollama models. I switched to wizardcoder:34b-python and asked the same question, which it had previously answered well, and it responded with this. That was at a temperature of 0.2, so it should have stayed on target. Or maybe I'm doing something wrong.
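For what it's worth, the stray "###" chatter is usually a sign that the client isn't applying the model's chat template: deepseek-coder's instruct variants are trained on a prompt wrapped in `### Instruction:` / `### Response:` markers (the exact wording here is my assumption from the published model card, not something confirmed in this thread). A minimal sketch of building such a prompt:

```python
def build_deepseek_prompt(user_message: str) -> str:
    """Wrap a user message in the instruct-style template that
    deepseek-coder's instruct variants expect (assumed format)."""
    return (
        "### Instruction:\n"
        f"{user_message}\n"
        "### Response:\n"
    )

prompt = build_deepseek_prompt("Write a function that reverses a string.")
print(prompt)
```

If the extension sends the raw question without these markers, the model may echo or ask about "###" instead of answering, which would match the behavior described above.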
@daniel-a-diaz
It is already fixed in version 2.2.6
Please add the 6.7b-instruct model as well. Per the Big Code Models Leaderboard, it's the best-performing 7B-parameter model:
Big Code Models Leaderboard - a Hugging Face Space by bigcode
DeepSeek Coder seems to be the next big model, performing better than GPT-3.5-Turbo and CodeLlama-34B. Could you please add it to your supported models? Also, is there a way to use arbitrary models with CodeGPT?
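On the last question: since these models run through Ollama, they are served from a local REST endpoint (`POST /api/generate`, port 11434 by default), so any locally pulled model can in principle be queried directly even if the extension's dropdown doesn't list it. A hedged sketch of the request payload only (the model tag is an example; field names follow Ollama's documented API):

```python
import json

# Example request body for Ollama's /api/generate endpoint.
# "deepseek-coder:6.7b-instruct" is an illustrative tag; use any
# model you have pulled locally with `ollama pull`.
payload = {
    "model": "deepseek-coder:6.7b-instruct",
    "prompt": "Write a function that reverses a string.",
    "stream": False,
    "options": {"temperature": 0.2},  # same setting discussed above
}

body = json.dumps(payload)
print(body)
```

Whether CodeGPT exposes a free-text model name for its Ollama provider is up to the extension, so this is only a sketch of what the underlying request would look like.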