lestan opened 9 months ago
I use Ollama as my inference server for local LLMs. Ollama is supported by many LLM frameworks, but not by Guidance.
Would love to see a direct integration with Ollama via the models package.
I'm aware that LiteLLM support is available and can be used to proxy Ollama, but that adds overhead and makes the solution more complex.
Supporting Ollama would immediately enable support for all the models Ollama makes available in its model library.
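For reference, here is a minimal sketch of the kind of workaround available today, without a native integration: pointing Guidance at Ollama's OpenAI-compatible endpoint (served at http://localhost:11434/v1 by default). This assumes `models.OpenAI` forwards `base_url` and `api_key` to the underlying OpenAI client; the model name is whatever you have pulled into Ollama.

```python
from guidance import models

# Workaround sketch: reuse Guidance's OpenAI model class against
# Ollama's OpenAI-compatible API rather than a native integration.
# Assumes base_url/api_key are passed through to the OpenAI client.
lm = models.OpenAI(
    "llama3",                              # any model pulled into Ollama
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # placeholder; Ollama ignores the key
)
```

This works for basic generation, but a remote OpenAI-style endpoint can't take full advantage of Guidance's constrained decoding, which is part of why a direct integration via the models package would be valuable.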
May I ask if there are any updates?