carlrobertoh / CodeGPT

The leading open-source AI copilot for JetBrains. Connect to any model in any environment, and customize your coding experience in any way you like.
https://codegpt.ee
Apache License 2.0

Error "Something went wrong" when sending code completion request to Ollama instance #684

Closed Majroch closed 3 weeks ago

Majroch commented 4 weeks ago

What happened?

After updating to the new version of the plugin in PhpStorm, an error is thrown whenever I write something in the editor with code completion enabled. The same happens in the Chat window.

The only exception is fetching the model list from the instance: that request is sent and a valid list of available models is returned.

Relevant log output or stack trace

Something went wrong

java.net.ConnectException
    at java.net.http/jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:951)
    at java.net.http/jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:133)
    at ee.carlrobert.llm.client.ollama.OllamaClient.processStreamRequest(OllamaClient.java:222)
    at ee.carlrobert.llm.client.ollama.OllamaClient.lambda$getCompletionAsync$2(OllamaClient.java:79)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
    at java.base/java.util.concurrent.CompletableFuture$AsyncRun.exec(CompletableFuture.java:1796)
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:507)
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1491)
    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:2073)
    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:2035)
    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:187)
Caused by: java.net.ConnectException
    at java.net.http/jdk.internal.net.http.common.Utils.toConnectException(Utils.java:1028)
    at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:227)
    at java.net.http/jdk.internal.net.http.PlainHttpConnection.checkRetryConnect(PlainHttpConnection.java:280)
    at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$2(PlainHttpConnection.java:238)
    at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:934)
    at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:911)
    at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
    at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1773)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
    at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: java.nio.channels.ClosedChannelException
    at java.base/sun.nio.ch.SocketChannelImpl.ensureOpen(SocketChannelImpl.java:202)
    at java.base/sun.nio.ch.SocketChannelImpl.beginConnect(SocketChannelImpl.java:786)
    at java.base/sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:874)
    at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$1(PlainHttpConnection.java:210)
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
    at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:212)
    ... 9 more

Steps to reproduce

  1. Install the newest update
  2. Enable code completion with an Ollama instance configured
  3. Write something in the Chat or in an open file

CodeGPT version

2.11.0-241.1

Operating System

Linux

Kenterfie commented 3 weeks ago

The problem currently prevents any use of this plugin, and downgrading no longer works due to the new IntelliJ version.

carlrobertoh commented 3 weeks ago

It looks like the host can't be overridden since the last version. So, if your Ollama server is running on a port other than 11434, the connection will fail.

I will fix this in the next release. As a workaround, please run the server on the default port (11434).
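Until the fix lands, a quick way to confirm which endpoint is actually being hit is to probe the default one directly. This is a hedged sketch: `ollama_url` is a hypothetical helper mirroring the suspected fallback behavior (configured host ignored, default `localhost:11434` used), not plugin code.

```shell
# Hypothetical helper mirroring the suspected fallback: if the configured
# host is not applied, the client ends up calling localhost:11434.
ollama_url() {
  echo "${OLLAMA_HOST_OVERRIDE:-http://localhost:11434}"
}

# Probe the endpoint the plugin would end up calling (uncomment to run):
# curl -s "$(ollama_url)/api/tags"
```

If the `curl` probe against the default port succeeds while your configured host fails, that matches the behavior described above.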

Kenterfie commented 3 weeks ago

This cannot be the reason. In my case the port is already the default one.

My current setting is http://llm.local.net:11434

carlrobertoh commented 3 weeks ago

It looks like the host can't be overridden

The same issue applies to your use case as well. 6b7e26

Majroch commented 3 weeks ago

It looks like the host can't be overridden since the last version. So, if your Ollama server is running on a port other than 11434, the connection will fail.

OK, my instance is on port 443, so that could be it. I use Nginx as a proxy between the Ollama instance and PhpStorm.

I'll wait for the new release and check whether it works :)
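A reverse-proxy setup like the one described might look roughly like this. This is only a sketch: the server name, certificate paths, and upstream address are placeholders, not the actual configuration from this thread.

```nginx
server {
    listen 443 ssl;
    server_name llm.local.net;

    # Placeholder certificate paths
    ssl_certificate     /etc/nginx/certs/llm.local.net.crt;
    ssl_certificate_key /etc/nginx/certs/llm.local.net.key;

    location / {
        # Forward API traffic to the local Ollama instance
        proxy_pass http://127.0.0.1:11434;
        proxy_http_version 1.1;
        # Completion responses stream token by token, so disable buffering
        proxy_buffering off;
        proxy_read_timeout 300s;
    }
}
```

With a setup like this, the plugin's configured URL points at port 443, which is exactly the non-default-port case the fix addresses.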

tomsykes commented 3 weeks ago

I've just tried installing CodeGPT and came across this issue. My Ollama instance is hosted remotely, and I was getting the "Something went wrong" message.

I've used an SSH tunnel (from localhost:11434 to remote:port) as a workaround for now.
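The tunnel described above can be sketched as follows; the remote host, user, and port are placeholders for the actual values.

```shell
# Placeholders: adjust to your remote host and the port Ollama listens on.
REMOTE_HOST="remote.example.com"
REMOTE_PORT=11434

# Forward local 11434 to the remote Ollama instance.
# -N: no remote shell, just port forwarding; -L: local forward spec.
TUNNEL="11434:localhost:${REMOTE_PORT}"
# ssh -N -L "$TUNNEL" user@"$REMOTE_HOST"
echo "$TUNNEL"
```

This sidesteps the host-override bug because the plugin then talks to the default localhost:11434 endpoint.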

carlrobertoh commented 3 weeks ago

This issue should be fixed in the latest version (2.11.1). Please reopen the ticket if it is still reproducible.

Majroch commented 3 weeks ago

I can confirm that everything works after updating to 2.11.1 :D

jwsims commented 3 weeks ago

I'm not sure if this is related to the fix or not, but has anyone else noticed that Ollama ignores the timeout setting in Advanced Settings?