Erotemic opened 1 month ago
Yeah, you have to run a local server through ollama, e.g. `ollama run deepseek-coder`. I'm planning to allow different frameworks to be used (e.g., making IPs/ports configurable). What framework are you using?
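Before pointing the plugin at the server, it can help to check that ollama is actually reachable. This is a minimal sketch (not part of the plugin) that probes the default ollama port; the function name, default host/port, and the `/api/tags` endpoint (which lists installed models on a stock ollama server) are assumptions for illustration:

```python
import urllib.request
import urllib.error


def ollama_is_running(host="localhost", port=11434, timeout=1.0):
    """Return True if a local ollama server answers on host:port.

    11434 is ollama's default port; the host/port defaults here are
    illustrative, not the plugin's actual configuration.
    """
    url = f"http://{host}:{port}/api/tags"  # endpoint that lists installed models
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Server not running, wrong port, or request timed out
        return False
```

A check like this could give a clearer error message than a failed completion request when the server isn't up.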
Nothing yet; I just want something I can control via Python and vim. I would like to run the model locally. I'll look into the ollama ecosystem and perhaps submit PRs to help with configuration.
Hi @Erotemic, I implemented a basic config mechanism (a function that returns a dict with the configuration).
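For anyone skimming this thread, a config mechanism of that shape might look like the sketch below. The function name and keys are hypothetical, not the plugin's actual schema; only the port default follows ollama's documented default:

```python
def ollama_config():
    """Hypothetical config function returning a dict (keys are illustrative)."""
    return {
        "host": "localhost",       # where the ollama server is listening
        "port": 11434,             # ollama's default port
        "model": "deepseek-coder", # model name passed to the server
    }
```

Returning a plain dict keeps the mechanism easy to override from vimrc-driven Python code, since callers can merge in their own values with `{**ollama_config(), "port": 9999}`.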
The current instructions seem incomplete. Looking at the code, it appears you need a local server running on a specific port, but it's unclear to me exactly what needs to be done, so input here would be helpful.