Closed. tadq closed this 5 months ago.
Hi, each model we support is reviewed and tested by the team before being labelled as supported. The main problem is that not all models perform certain operations well, and some cannot perform them at all.
We do plan to support Llama3 models once CodeLlama3 is available. We've already tested with Llama3, and its code-completion performance is sub-par compared to coding-specific models such as Deepseek and Magicoder (DS only).
However, we are able to add Chat support for it. Again, it's not a code-focused model, so it won't necessarily give you the best results depending on how you're using it.
I will test adding it for Chat today (since its code completion is currently pretty terrible), and will also test https://ollama.com/wojtek/magicoder:6.7b-s-ds-q8_0, which performs very well on https://huggingface.co/spaces/mike-ravkine/can-ai-code-results
Handled by release v0.3.2 https://github.com/RussellCanfield/wingman-ai/releases/tag/v0.3.2
I am using DeepSeek-Coder for both Go and Rust development. It is great for Go but not great for Rust. Thanks for the quick turnaround.
Remove the restriction and allow other values in `chatModel` and `codeModel` inside "Wingman.Ollama". In version v0.3.1 only specific models are allowed, which is very limiting. Allow use of llama3 models.
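For context, a sketch of what an unrestricted configuration could look like in VS Code's `settings.json` (JSONC). The `chatModel`/`codeModel` names come from the issue text; the exact key layout and the model tags below are illustrative assumptions, not the extension's confirmed schema:

```jsonc
{
  // Hypothetical example — key structure and model tags are assumptions.
  // The request is that any Ollama model tag be accepted here,
  // not just the whitelist shipped in v0.3.1.
  "Wingman.Ollama": {
    "chatModel": "llama3:8b-instruct-q6_K",
    "codeModel": "deepseek-coder:6.7b-base-q8_0"
  }
}
```

With free-form values, users could point the extension at any model pulled locally via `ollama pull`, at their own risk regarding completion quality.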