taichimaeda / markpilot

AI-powered inline completions and chat view for Obsidian

Feature: Support alternative OpenAI API compatible endpoints #2

Closed: cori closed this issue 2 months ago

cori commented 3 months ago

I have two use cases here:

  1. Using something like OpenRouter to be able to try lots of different models.
  2. Using a local model to avoid sending data to a third party, either for security or for cost reasons.

Ideally one could configure completions and chat separately.

E.g. I have two models running on a little machine on my local network. One is probably sufficient for completions, and the other handles slightly more complicated tasks (at the cost of some speed). I'd love to be able to send some or all of markpilot's requests to those models.
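
For concreteness, here's a rough sketch of what I mean, using the official `openai` Node SDK. The base URLs, model names, and env vars below are just examples (not anything Markpilot actually uses), but they show how an OpenAI-compatible client can be pointed at different endpoints for completions vs. chat:

```typescript
import OpenAI from "openai";

// Any OpenAI-compatible endpoint can be targeted by overriding the base URL.
// The URLs, model names, and env vars below are examples only.

// Use case 1: OpenRouter, which exposes many models behind one API.
const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Use case 2: a local server (e.g. Ollama's OpenAI-compatible endpoint),
// so nothing leaves the machine.
const local = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "unused", // most local servers ignore the key
});

async function demo() {
  // Inline completions could go to the small, fast local model...
  const completion = await local.chat.completions.create({
    model: "llama3",
    messages: [{ role: "user", content: "Continue this note: ..." }],
  });

  // ...while chat could go to a larger model via OpenRouter.
  const chat = await openrouter.chat.completions.create({
    model: "anthropic/claude-3-haiku",
    messages: [{ role: "user", content: "Summarize this note." }],
  });

  console.log(completion.choices[0].message.content);
  console.log(chat.choices[0].message.content);
}

demo().catch(console.error);
```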

taichimaeda commented 3 months ago

That's an interesting idea (and should definitely be feasible).

I'm working on a new plugin at the moment, so this might take a while, but I'll make sure to keep it on the roadmap.

taichimaeda commented 2 months ago

This should be available in the latest release!