Closed: endlessblink closed this issue 2 weeks ago
Describe the feature

The feature of using local models for summarization through Ollama, instead of Claude's API, could be really helpful.

Added ollama support in https://github.com/thewh1teagle/vibe/releases/tag/v2.6.5
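
For reference, a minimal sketch of what summarization through a local Ollama model might look like, assuming the standard Ollama HTTP API (POST /api/generate on the default port 11434) and a hypothetical "llama3" model name. This is illustrative only, not Vibe's actual implementation.

```python
import requests

# Default endpoint for a locally running Ollama server (assumption: default port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def summarize_locally(transcript: str, model: str = "llama3") -> str:
    """Send a transcript to a local Ollama model and return its summary."""
    payload = {
        "model": model,  # hypothetical local model name; use whatever is pulled locally
        "prompt": f"Summarize the following transcript:\n\n{transcript}",
        "stream": False,  # ask for the full response in a single JSON object
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    # The non-streaming generate endpoint returns the model output in "response".
    return response.json()["response"]


if __name__ == "__main__":
    print(summarize_locally("Speaker 1: Hello and welcome to the meeting..."))
```

Since the model runs locally, no API key is needed and the transcript never leaves the machine, which is the main appeal over sending it to Claude's API.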