getsavvyinc / savvy-cli

Automatically capture and surface your team's tribal knowledge
https://getsavvy.so
MIT License
311 stars · 10 forks

Local LLMs Support #127

Open av opened 3 months ago

av commented 3 months ago

Hi, thanks for building and opening Savvy!

Is there any way I can configure it to use a locally-running LLM? With OpenAI-compatible API or otherwise.

Thanks!

joshi4 commented 3 months ago

Curious about your use case for local LLMs vs. OpenAI?

av commented 3 months ago

Usage in no-network conditions, data protection, and the ability to choose specific models for specific kinds of workloads (for example, fine-tuned ones).

Nothing unique, just the standard "local LLM" use case.
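For context, supporting this usually only takes a configurable base URL, since local servers such as Ollama expose an OpenAI-compatible endpoint. A minimal sketch of what such a config shim could look like (the `SAVVY_LLM_*` env var names are hypothetical, not Savvy's actual configuration; `http://localhost:11434/v1` is Ollama's default OpenAI-compatible endpoint):

```python
import os

def resolve_llm_config() -> dict:
    """Read LLM settings from the environment, falling back to OpenAI defaults.

    Hypothetical env vars for illustration only -- not Savvy's real config.
    """
    base_url = os.environ.get("SAVVY_LLM_BASE_URL", "https://api.openai.com/v1")
    model = os.environ.get("SAVVY_LLM_MODEL", "gpt-4o")
    # Local OpenAI-compatible servers generally accept any non-empty key.
    api_key = os.environ.get("SAVVY_LLM_API_KEY", "not-needed-locally")
    return {"base_url": base_url, "model": model, "api_key": api_key}

# Point the tool at a local Ollama instance serving Llama 3.1:
os.environ["SAVVY_LLM_BASE_URL"] = "http://localhost:11434/v1"
os.environ["SAVVY_LLM_MODEL"] = "llama3.1"
cfg = resolve_llm_config()
print(cfg["base_url"], cfg["model"])
# → http://localhost:11434/v1 llama3.1
```

The resulting `base_url`/`api_key` pair can then be handed to any OpenAI-style client, which is what makes the "OpenAI-compatible API or otherwise" framing above attractive: one code path covers both the hosted and the local case.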

joshi4 commented 3 months ago

Thanks for sharing!

Local redaction and support for local LLMs are planned, and I'm tracking them on our public feedback board here

joshi4 commented 3 months ago

Hi @av,

quick update:

I've moved away from using OpenAI for generating runbooks and we now use Llama 3.1 hosted on Groq.

Savvy ask/explain still uses GPT-4o for now.