To use local models with Ollama, a sample `config.yaml` is:

    llm_api_key: no need
    llm_base_url: http://localhost:11434
    llm_custom_provider: null
    llm_model: ollama/mistral
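Not from the thread, but as a quick sanity check of that setup: assuming the project routes calls through LiteLLM (the `ollama/mistral` provider/model string follows LiteLLM's convention), a minimal sketch to confirm the local Ollama endpoint responds with the values above:

```python
# Hypothetical sanity check: verifies a local Ollama server is reachable
# with the same settings used in config.yaml. Assumes `ollama serve` is
# running and `ollama pull mistral` has been done.
import litellm

response = litellm.completion(
    model="ollama/mistral",             # llm_model from config.yaml
    api_base="http://localhost:11434",  # llm_base_url from config.yaml
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the app should work with the same `config.yaml` values; no API key is needed for a local server.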
Very cool project
Thanks, I updated the README.