karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

Support Codestral Mamba #465

Open robert-zaremba opened 3 weeks ago

robert-zaremba commented 3 weeks ago

Codestral, while producing great output quality, is really slow.

Codestral Mamba looks to improve on this without sacrificing result quality: https://mistral.ai/news/codestral-mamba/

Is there any way we can have support for Codestral Mamba? It isn't supported by llama.cpp (nor by ollama), so I'm checking here whether there is another viable solution that could be implemented to support it.

karthink commented 3 weeks ago

First you'd need to deploy the model somehow; neither Emacs nor gptel can help with that. Once it's running, gptel can support it if the deployment offers an HTTP API endpoint.
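For context, if the deployment exposes an OpenAI-compatible endpoint (as many inference servers do), it can be registered with gptel via `gptel-make-openai`. A minimal sketch, assuming a hypothetical server on `localhost:8000` and the model name `codestral-mamba` (both are placeholders, not anything the deployment guarantees):

```elisp
;; Register an OpenAI-compatible backend with gptel.
;; :host and :models are assumptions for illustration --
;; adjust them to match the actual deployment.
(gptel-make-openai "codestral-mamba"
  :protocol "http"
  :host "localhost:8000"
  :stream t
  :models '("codestral-mamba"))
```

The backend then becomes selectable from gptel's menu like any other model.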