Closed ptgoetz closed 5 months ago
Initial PR: #276
What?
Add Ollama as an LLM option.
Why?
Ollama allows you to run an LLM model/service locally with minimal effort. This can be especially important, for example, when demoing the project and derivative projects in a network-disconnected environment.
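As a rough illustration of what "locally" means here, a minimal client could talk to Ollama's HTTP API on its default port. This is a sketch, not the PR's implementation; it assumes Ollama is installed, serving on `localhost:11434`, and that a model named `llama3` has already been pulled (`ollama pull llama3`):

```python
# Minimal sketch of calling a locally running Ollama server via its HTTP API.
# Assumes Ollama serves on the default port 11434 and that the "llama3" model
# has been pulled; both the port and model name are assumptions here.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    # stream=False asks Ollama for a single JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return its response text."""
    payload = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Why is the sky blue?"))
```

Because the server and model run entirely on the local machine, this call works without any network connection, which is what makes the disconnected-demo scenario possible.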
Implementation Considerations