DonTizi / ReMind

Your Local Artificial Memory on your Device.
https://www.recallmemory.io/
Apache License 2.0

Remote Ollama host possible? #15

Open zheroz00 opened 1 month ago

zheroz00 commented 1 month ago

This is just what I've been looking for. Is it possible to use a remote host for Ollama? I tried changing localhost:11434 to <my remote IP>:11434, but I receive an error when I try to chat or pull a model.

    at async DevServer.handleRequestImpl (V:\git\ReMind\node_modules\next\dist\server\base-server.js:812:17)
    at async V:\git\ReMind\node_modules\next\dist\server\dev\next-dev-server.js:339:20
    at async Span.traceAsyncFn (V:\git\ReMind\node_modules\next\dist\trace\trace.js:154:20)
    at async DevServer.handleRequest (V:\git\ReMind\node_modules\next\dist\server\dev\next-dev-server.js:336:24)
    at async invokeRender (V:\git\ReMind\node_modules\next\dist\server\lib\router-server.js:173:21)
    at async handleRequest (V:\git\ReMind\node_modules\next\dist\server\lib\router-server.js:350:24)
    at async requestHandlerImpl (V:\git\ReMind\node_modules\next\dist\server\lib\router-server.js:374:13)
    at async Server.requestListener (V:\git\ReMind\node_modules\next\dist\server\lib\start-server.js:141:13) {
      [cause]: AggregateError [ECONNREFUSED]:
          at internalConnectMultiple (node:net:1118:18)
          at afterConnectMultiple (node:net:1685:7)
          at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
        code: 'ECONNREFUSED',
        [errors]: [ [Error], [Error] ]
      }
    }
    POST /api/model 500 in 1366ms

Thanks
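[Editor's note: the ECONNREFUSED in the trace means the Next.js server reached the remote address but the connection was refused. A common cause is that Ollama binds only to 127.0.0.1 by default, so it never accepts connections from other machines. A minimal check, assuming the remote server's address is 192.168.1.50 (a placeholder):

    # On the remote machine: make Ollama listen on all interfaces, then restart it
    OLLAMA_HOST=0.0.0.0 ollama serve

    # From the machine running ReMind: verify Ollama is reachable
    curl http://192.168.1.50:11434/api/tags

If the curl call fails, the problem is the Ollama server's bind address or a firewall, not ReMind's configuration.]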

ztnewman commented 2 weeks ago

Try setting NEXT_PUBLIC_OLLAMA_URL to your Ollama URL in .env or .env.local.
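
[Editor's note: a minimal .env.local sketch; the address below is a placeholder, so substitute your own server's IP and port:

    NEXT_PUBLIC_OLLAMA_URL=http://192.168.1.50:11434

Because Next.js inlines NEXT_PUBLIC_ variables at build time, restart the dev server (or rebuild) after changing this value.]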