Open bmorrisondev opened 3 months ago
I have Ollama running on a separate machine from my daily driver. Would love to have a setting to send messages to a remote machine via an IP address.
@bmorrisondev Did you have a chance to try out adding the remote machine using the "custom models" command?
LMK if that worked or not when you get the chance!
I have not yet! I probably won’t be able to try this for another two weeks to be honest, but I’ll report back when I do!
@jcollingj Choosing Custom throws an error because context_window is undefined.
However, all I wanted was to use Llama 3.1, so I just replaced llama3 in a couple of places, then came here to check if anyone has brought up using other models. Maybe the Custom option covers that too?
If not, it'd be a nice feature to have.
Love the plugin regardless, it's intuitive and nice.
@bmorrisondev Maybe Ollama's default IP can just be quickly replaced with your remote IP in main.js? At least changing the model was easy and fast.
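For anyone attempting the same workaround, here is a minimal sketch of the two edits, assuming the bundled main.js hardcodes the model name and the default Ollama endpoint (the identifiers below are illustrative; the plugin's actual code may differ):

```ts
// Illustrative only; these names are assumptions, not the plugin's real code.

// Swapping the hardcoded model name:
const model = "llama3";       // before
// const model = "llama3.1";  // after

// Pointing the client at a remote machine instead of Ollama's default endpoint:
const ollamaUrl = "http://localhost:11434";       // Ollama's default
// const ollamaUrl = "http://192.168.1.50:11434"; // remote machine's IP
```

Note that the remote machine also has to accept non-localhost connections, e.g. by starting Ollama with OLLAMA_HOST=0.0.0.0.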
@vari0nce Can you send a screenshot of what you mean? I do need to add native support for Llama 3.1 and similar in Ollama! I'll do that now.
But I would also appreciate some insight into that context_window undefined error so I can fix it!
Sorry for taking so long to answer.
Steps to reproduce: choose the Custom option from the model dropdown.
Error:

```
plugin:caret:68062 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'context_window')
    at eval (plugin:caret:68062:127)
    at HTMLSelectElement.<anonymous> (app.js:1:1489332)
```
So apparently context_window doesn't get a default value, at least for the Custom option, and the error breaks everything after it.
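A defensive fix along these lines might avoid the crash. This is only a sketch, assuming the selection handler looks the model up in a map keyed by name; the real variable names in the plugin will differ:

```ts
interface ModelSettings {
  context_window?: number;
}

function getContextWindow(models: Record<string, ModelSettings>, name: string): number {
  // Optional chaining plus a conservative fallback, instead of reading
  // context_window off an undefined entry (which is what throws above).
  return models[name]?.context_window ?? 4096;
}
```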
@vari0nce I was recently working in that space, and that might have solved the issue. Could you try it again and let me know? And if that doesn't work, could you open another issue and copy/paste your latest comment into it?
Just so that this thread stays focused on remote Ollama and I can track the issue better.
Would it be possible to add a configurable base path for Ollama, please? Maybe similar to Smart Second Brain or other plugins that use Ollama.
The route "/api/tags" returns all the models, which could populate the list.
@oppenheimer- Could you make this a separate issue, please? Also a question for you: does that /api/tags route offer metadata about the models, like context window size or whether they support function calling? That metadata is needed in the plugin, hence having to add models manually (at least currently).