bgreene2 opened 7 months ago
Thanks - this works well! However, I've been experimenting downstream with grammars (GBNF), and Ollama doesn't yet support that kind of guided generation.
I might try Outlines instead of GBNF; it integrates with llama.cpp, but I'm unsure about Ollama support.
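As a partial workaround while full GBNF support is missing, Ollama's REST API does accept a `"format": "json"` option on `/api/generate`, which constrains output to valid JSON. A minimal sketch using only the standard library (the endpoint is Ollama's default; the model name is a placeholder):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def build_json_request(model: str, prompt: str) -> dict:
    """Build a payload asking Ollama to constrain output to valid JSON.

    Note: this is not full GBNF grammar support, just the built-in
    JSON mode, which is the closest option Ollama currently offers.
    """
    return {
        "model": model,
        "prompt": prompt,
        "format": "json",   # constrain the response to valid JSON
        "stream": False,    # return one complete response object
    }

def generate(model: str, prompt: str) -> str:
    """Send the request; requires a running Ollama server."""
    payload = json.dumps(build_json_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

This won't replace a real grammar, but it covers the common case of forcing structured JSON out of the model.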
This adds the ability to point the app at a model hosted on an Ollama installation instead of running the model within the app.
Tested on my system: a PC running Windows 11 with WSL, where Ollama is installed on the Windows side and the app runs inside WSL.