Caret, an Obsidian Plugin
https://caretplugin.ai/
MIT License

Allow use of remote Ollama instance #5

Open bmorrisondev opened 3 months ago

bmorrisondev commented 3 months ago

I have Ollama running on a separate machine from my daily driver. Would love to have a setting to send messages to a remote machine via an IP address.
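
For anyone setting this up in the meantime: Ollama only listens on 127.0.0.1:11434 by default, so the remote machine has to expose the server first (for example by starting it with OLLAMA_HOST=0.0.0.0). Below is a minimal TypeScript sketch of talking to a remote instance through Ollama's standard /api/chat endpoint; the IP address is a placeholder, and this is not Caret's actual code:

```ts
// Sketch: send one chat message to a remote Ollama instance.
// Assumes the remote machine runs Ollama with OLLAMA_HOST=0.0.0.0
// so it listens on the LAN rather than just localhost.
const OLLAMA_BASE_URL = "http://192.168.1.50:11434"; // placeholder IP

async function chatWithRemoteOllama(prompt: string): Promise<string> {
    const response = await fetch(`${OLLAMA_BASE_URL}/api/chat`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
            model: "llama3",
            messages: [{ role: "user", content: prompt }],
            stream: false, // one JSON object instead of a token stream
        }),
    });
    const data = await response.json();
    return data.message.content;
}
```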

jcollingj commented 3 months ago

@bmorrisondev Did you have a chance to try out adding the remote machine using the "custom models" command?

LMK if that worked or not when you get the chance!

bmorrisondev commented 3 months ago

I have not yet! I probably won’t be able to try this for another two weeks to be honest, but I’ll report back when I do!

vari0nce commented 3 months ago

@jcollingj Choosing Custom throws an error because context_window is undefined.

However, all I wanted was to use Llama 3.1, so I just replaced llama3 in a couple of places, then came here to check whether anyone had brought up using other models. Maybe the custom option covers that too?

If not, it'd be a nice feature to have.

Love the plugin regardless, it's intuitive and nice.

@bmorrisondev Maybe Ollama's default IP can just be swapped for your remote IP in main.js? At least changing the model was quick and easy.
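
For anyone following that workaround, here is a hedged sketch of the kind of edit being described, assuming the bundled main.js hardcodes Ollama's default base URL and model name somewhere; the variable names are illustrative and will not match Caret's actual build:

```ts
// Illustrative only: names are hypothetical, not Caret's actual code.
// Before: the plugin targets a local Ollama with the built-in model.
const ollamaUrl = "http://localhost:11434";
const ollamaModel = "llama3";

// After: point at the remote machine and the newer model instead.
const remoteOllamaUrl = "http://192.168.1.50:11434"; // placeholder IP
const remoteOllamaModel = "llama3.1";
```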

jcollingj commented 3 months ago

@vari0nce Can you send a screenshot of what you mean? I do need to add native support for llama 3.1 and similar in ollama! I'll do that now.

But I'd also appreciate some insight into that context_window undefined error so I can fix it!

vari0nce commented 3 months ago

Sorry for taking so long to answer.

Steps to reproduce:

  1. Download and enable Caret.
  2. Select Custom LLM Provider from the plugin menu.
  3. Nothing happens apart from the console error. (I'm assuming this is when the input fields should show up.)
  4. Reproduced in both my regularly used vault and a fresh sandbox vault.

Error:

```
plugin:caret:68062 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'context_window')
    at eval (plugin:caret:68062:127)
    at HTMLSelectElement.<anonymous> (app.js:1:1489332)
```

So apparently context_window has no default value, at least for the Custom option, and the resulting error breaks everything after it.
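
For what it's worth, a defensive fallback along these lines would avoid that crash. This is a sketch under the assumption that the select handler reads context_window from a per-model config object that simply doesn't exist yet for the Custom option; the names are illustrative, not Caret's actual code:

```ts
// Illustrative guard: fall back to a default when the selected
// provider has no model entry yet (e.g. the Custom option).
interface ModelConfig {
    context_window: number;
}

function getContextWindow(model: ModelConfig | undefined): number {
    // Optional chaining plus a fallback prevents the
    // "Cannot read properties of undefined" TypeError.
    return model?.context_window ?? 4096;
}
```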

jcollingj commented 2 months ago

@vari0nce I was recently working in that area, and that might have solved the issue. Could you try it again and let me know? If it doesn't work, could you open another issue and copy/paste your latest comment into it?

Just so this thread stays focused on remote Ollama and I can track the issue better.

oppenheimer- commented 1 month ago

Would it be possible to add a base path setting for Ollama, please? Maybe similar to Smart Second Brain or other plugins that use Ollama.

[screenshot: base path setting from another Ollama plugin]

The /api/tags route delivers all the models needed to populate the list.
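
For context, /api/tags is Ollama's documented endpoint for listing installed models. A minimal TypeScript sketch of populating a model list from it, following the documented response shape:

```ts
// List the models installed on an Ollama instance via GET /api/tags.
// The response shape is { models: [{ name, size, details, ... }] }.
async function listOllamaModels(baseUrl: string): Promise<string[]> {
    const response = await fetch(`${baseUrl}/api/tags`);
    const data = await response.json();
    return data.models.map((m: { name: string }) => m.name);
}

// e.g. listOllamaModels("http://192.168.1.50:11434")
//   -> ["llama3:latest", "llama3.1:8b", ...]
```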

jcollingj commented 4 weeks ago

@oppenheimer- Could you make this a separate issue, please? Also a question for you: does that /api/tags route offer metadata about the models, like context window size and whether they support function calling? That metadata is needed in the plugin, hence having to add models manually (at least currently).
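
For reference: /api/tags only returns basic details per model (family, parameter size, quantization level), so context window and function-calling support would have to come from elsewhere. Ollama's POST /api/show does expose a context length inside its model_info block, under an architecture-prefixed key such as llama.context_length, though the exact keys vary by model family. A hedged sketch:

```ts
// Sketch: look up a model's context length via POST /api/show.
// model_info keys are architecture-prefixed (e.g. "llama.context_length"),
// so this does a best-effort search rather than assuming one key.
async function getContextLength(
    baseUrl: string,
    model: string
): Promise<number | undefined> {
    const response = await fetch(`${baseUrl}/api/show`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model }),
    });
    const data = await response.json();
    const info = data.model_info ?? {};
    const key = Object.keys(info).find((k) => k.endsWith(".context_length"));
    const value = key ? info[key] : undefined;
    return typeof value === "number" ? value : undefined;
}
```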