Closed Aiq0 closed 9 months ago
Nice, looks good to me. Will probably be interesting for a lot of people who want to test out Mistral or other models locally. Thanks to the both of you :)
I now tested it with Ollama and codellama and it works as expected.
Perfection!
Thank you for your contribution @Aiq0, I'll release this to packagecontrol in the next few days.
Promotion among them is more than welcome.
Added support for other LLMs with an OpenAI-compatible API.
Partially fixes #29. Should fix #21 (if running Llama 2 via Ollama).
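As a sketch of what "OpenAI-compatible API" means here: any server that exposes the `/v1/chat/completions` route can be targeted just by swapping the base URL, and Ollama serves such an endpoint at `http://localhost:11434/v1` by default. A minimal illustration (the helper name and the assumption of a locally pulled `codellama` model are mine, not from this PR):

```python
import json

# Assumption: a default local Ollama install, which exposes an
# OpenAI-compatible API at this base URL.
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for a /v1/chat/completions call.

    Works against any OpenAI-compatible server: only base_url changes.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body


# Example: target codellama running locally via Ollama.
url, body = build_chat_request(OLLAMA_BASE_URL, "codellama", "Write a haiku.")
# Sending is omitted here; any HTTP client can POST `body` to `url`
# with the header Content-Type: application/json.
```

The same request shape works unchanged against Mistral, Llama 2, or any other model the local server hosts, which is why a single base-URL setting is enough to switch backends.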
Done just before school; I will test it later today.
@rubjo @yaroslavyaroslav can you please test it too?