AbanteAI / rawdog

Generate and auto-execute Python scripts in the CLI
Apache License 2.0

Update readme #27

Closed iplayfast closed 9 months ago

iplayfast commented 9 months ago

To use local models with Ollama, a sample `config.yaml` is:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
```

Very cool project

jakethekoenig commented 9 months ago

Thanks, I updated the README.