Closed: Rambou closed this issue 7 months ago.
Hi, unfortunately we don't have plans for that at the moment.
@Rambou Hi, I have another project that supports Ollama (though it's not an IntelliJ plugin). You can find it here if you're interested: https://github.com/obiscr/ollama-ui. Thanks!
Ollama lets you run a local server with the model of your choice. Could you support that?
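For reference, here is a minimal sketch of what talking to a local Ollama server could look like from the JVM side. The port (Ollama's default, 11434), the model name ("llama2"), and the prompt are assumptions for illustration, not anything the plugin currently does:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Assumed setup: Ollama is running locally on its default port 11434
    // and a model named "llama2" has already been pulled.
    val body = """{"model": "llama2", "prompt": "Say hello", "stream": false}"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // With "stream": false the server returns a single JSON object whose
    // "response" field holds the full completion.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```

Since it's just JSON over HTTP against a configurable base URL, letting users point the plugin at something like http://localhost:11434 would likely cover most local setups.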