Closed: Kidlike closed this issue 3 months ago
@Kidlike Ollama support will land in the next release. Expect it to roll out within the next week or so.
@Kidlike This is now supported in version 1.2.0. You can use Ollama, and your data never leaves your computer.
Add a config option to target self-hosted models, e.g.:
https://localai.io/
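As a sketch of what such a config might look like (every key name below is an illustrative assumption, not an actual schema from this project), a self-hosted setup could point the tool at a local server. LocalAI serves an OpenAI-compatible API on port 8080 by default, and Ollama listens on port 11434:

```yaml
# Hypothetical config sketch: all key names here are assumptions for
# illustration; only the URLs/ports reflect the servers' real defaults.
llm:
  provider: openai-compatible
  base_url: http://localhost:8080/v1   # LocalAI's default OpenAI-compatible endpoint
  # base_url: http://localhost:11434   # Ollama's default API port
  model: llama3                        # whichever model the local server has loaded
  api_key: unused                      # local servers typically ignore the key
```

Keeping the endpoint and model name configurable like this would cover LocalAI, Ollama, and other self-hosted backends with one option.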