boswelja opened 3 days ago
It looks like none of the models I previously downloaded support this new logic (except the models you listed). Could we keep the FIM template configuration and make it optional instead?
With this, we can also support repository-level completions before they implement it on their end.
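Keeping the template configurable while defaulting to the native prefix/suffix API could look roughly like the sketch below. This is a minimal illustration, not CodeGPT's actual code; the function and template names are made up, and the CodeLlama-style infill string is just one example of a user-supplied template.

```python
from typing import Optional

# Example user-configurable FIM template (CodeLlama's documented infill
# format); any model-specific template could be plugged in here.
CODELLAMA_FIM = "<PRE> {prefix} <SUF>{suffix} <MID>"

def build_prompt(prefix: str, suffix: str, template: Optional[str]) -> str:
    """Build the completion prompt, honouring an optional FIM template.

    When no template is configured, return the raw prefix and let the
    server-side API handle the suffix natively.
    """
    if template is None:
        return prefix
    return template.format(prefix=prefix, suffix=suffix)
```

With a template configured, older models that rely on client-side FIM formatting keep working; without one, the request falls through to the server's own prefix + suffix handling.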
The Ollama `/api/generate` API now supports prefix + suffix, so let's use that. Note that there isn't currently a proper API for context files; I opened https://github.com/ollama/ollama/issues/7738 off the back of this. For now, I have hardcoded the deepseek-coder context format, and it seems to work OK across codeqwen2.5 and deepseek-coder, though it's probably not ideal.
Waiting for https://github.com/carlrobertoh/CodeGPT/pull/771