The custom client does not work when I pass localhost in the host parameter, as shown in the documentation:
const ollama = new Ollama({ host: 'http://localhost:11434' })
I'm getting the following trace:
Trace: Ollama probe failed TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at api\node_modules\ollama\dist\shared\ollama.1e233bce.cjs:114:20
    at Ollama.processStreamableRequest (xxx\node_modules\ollama\dist\shared\ollama.1e233bce.cjs:251:22)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1495:16) {
  errno: -4078,
  code: 'ECONNREFUSED',
  syscall: 'connect',
  address: '::1',
  port: 11434
}
It does work, however, if I use 127.0.0.1 instead of localhost.
Node version: 18.17.0
Platform: Windows 10
At the very least the documentation should be updated; I'm not sure what the proper resolution is. I'm using the default fetch from Node.
Regards,