ollama / ollama-js

Ollama JavaScript library
https://ollama.ai
MIT License

localhost is not working for custom client with node, but 127.0.0.1 does #102

Closed · balazskoti closed this issue 2 weeks ago

balazskoti commented 2 weeks ago

The custom client does not work when I use localhost in the host parameter, as shown in the documentation: const ollama = new Ollama({ host: 'http://localhost:11434' })

I'm getting the following trace:

 Trace: Ollama probe failed TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at ... (api\node_modules\ollama\dist\shared\ollama.1e233bce.cjs:114:20)
    at Ollama.processStreamableRequest (xxx\node_modules\ollama\dist\shared\ollama.1e233bce.cjs:251:22) {
  cause: Error: connect ECONNREFUSED ::1:11434
      at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1495:16) {
    errno: -4078,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '::1',
    port: 11434
  }
}

It does work, however, if I use 127.0.0.1 instead of localhost.
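For reference, this is a minimal sketch of the working setup, run as an ES module (the model name llama2 is just an example and assumes that model has been pulled locally):

    import { Ollama } from 'ollama';

    // Pointing the custom client at the IPv4 loopback connects fine:
    const ollama = new Ollama({ host: 'http://127.0.0.1:11434' });

    const response = await ollama.chat({
      model: 'llama2', // any locally pulled model
      messages: [{ role: 'user', content: 'Hello!' }],
    });
    console.log(response.message.content);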

Node version: 18.17.0
Platform: Windows 10

At the very least the documentation should be updated; I'm not sure what the right fix is. I'm using the default fetch from Node. The trace shows the connection attempt going to ::1, so it looks like Node's fetch resolves localhost to the IPv6 loopback first, while the Ollama server is only listening on the IPv4 one.
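A quick way to confirm the resolution order on a given machine (a diagnostic sketch, run as an ES module; not part of ollama-js):

    import { lookup } from 'node:dns/promises';

    // Node >= 17 defaults to 'verbatim' result order, so the IPv6 loopback
    // can come back first; that matches the address: '::1' in the trace above.
    const addresses = await lookup('localhost', { all: true });
    console.log(addresses);
    // e.g. [ { address: '::1', family: 6 }, { address: '127.0.0.1', family: 4 } ]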

Regards,

BruceMacD commented 2 weeks ago

Thanks for the report, this example should have used the loopback address rather than the hostname. Fixed in 57fafae5d5e79e78f0c3abdcd2e18e7ff5fd1329
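For anyone who wants to keep using the localhost hostname on Node 18, a user-side workaround (separate from the documentation fix in that commit; a sketch only) is to restore IPv4-first name resolution before making requests:

    import dns from 'node:dns';
    import { Ollama } from 'ollama';

    // Node >= 17 returns addresses in the resolver's own order ('verbatim'),
    // which can put ::1 first; 'ipv4first' makes localhost resolve to
    // 127.0.0.1 again for subsequent lookups.
    dns.setDefaultResultOrder('ipv4first');

    const ollama = new Ollama({ host: 'http://localhost:11434' });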