justyns / silverbullet-ai

Plug for SilverBullet to integrate LLM functionality
https://ai.silverbullet.md/
GNU Affero General Public License v3.0

Better support and docs for local models #36

Open · justyns opened this issue 2 months ago

justyns commented 2 months ago

Local models are currently supported as long as they expose an OpenAI-compatible API. It'd be better to have more documentation covering this, along with some examples of installing and configuring them. This is pretty important for #34.
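
As a rough illustration of what the docs could show, here is what "OpenAI-compatible" means from the plug's perspective: any backend that accepts the standard chat-completions request shape will work. This sketch assumes ollama running on its default port (11434) with a placeholder model name (`llama3`); it is not the plug's actual request code.

```typescript
// Minimal sketch: send a chat-completions request to a local OpenAI-compatible
// endpoint. Assumes ollama on its default port and an already-pulled model.
async function localChatCompletion(prompt: string): Promise<string> {
  const resp = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // placeholder; use whatever model you have pulled
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!resp.ok) {
    throw new Error(`Local model returned ${resp.status}`);
  }
  const data = await resp.json();
  // OpenAI-style responses put the generated text here.
  return data.choices[0].message.content;
}
```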

Two local solutions I'd focus on right now are:

- ollama
- LocalAI

litellm doesn't run models itself, but it would be good to document as a proxy for accessing other third-party APIs.

LocalAI seems to be gaining a lot of new features, but ollama should be pretty simple for anyone to run locally (assuming they also run SilverBullet locally).
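
Since all three speak the same OpenAI-style API, the docs could boil the per-backend differences down to a base URL (plus an API key where one is required). The defaults below are illustrative assumptions based on each project's usual out-of-the-box ports, not verified plug settings:

```typescript
// Illustrative defaults only; actual ports and paths depend on how each service is installed.
const OPENAI_COMPATIBLE_BASE_URLS = {
  ollama: "http://localhost:11434/v1", // ollama's built-in OpenAI-compatible endpoint
  localai: "http://localhost:8080/v1", // LocalAI's default HTTP port
  litellm: "http://localhost:4000/v1", // litellm proxy; forwards requests to third-party APIs
};
```

Everything else (request shape, response parsing) would stay the same as in the earlier sketch, which is what makes a single documentation page for "local / OpenAI-compatible backends" feasible.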