carlrobertoh / CodeGPT

JetBrains extension providing access to state-of-the-art LLMs, such as GPT-4, Claude 3, Code Llama, and others, all for free
https://codegpt.ee
Apache License 2.0

Unable to select model when using an Ollama backend. #606

Closed flutas closed 6 days ago

flutas commented 1 week ago

What happened?

When using the Ollama backend, the ComboBox for selecting a model does not display models, instead only displaying "Loading..." indefinitely.

[screenshot: model ComboBox stuck on "Loading..."]

This was caused by #601, which moved the ComboBox model update from being run via invokeLater to runInEdt.

https://github.com/carlrobertoh/CodeGPT/blob/c55237b72b7fb87450317f30d8169c1e2d3efa95/src/main/kotlin/ee/carlrobert/codegpt/settings/service/ollama/OllamaSettingsForm.kt#L156

Reverting this to invokeLater fixes the issue, but I decided to open an issue rather than a PR, as I don't know the underlying rationale behind moving from invokeLater to runInEdt.
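For reference, the change and its proposed revert amount to something like the following. This is a minimal sketch, not the exact code from OllamaSettingsForm.kt; `modelComboBox` and `availableModels` are illustrative names:

```kotlin
import com.intellij.openapi.application.ApplicationManager
import com.intellij.openapi.application.runInEdt
import javax.swing.DefaultComboBoxModel

// After #601 (the suspected regression): runInEdt executes the block
// immediately when already on the EDT, otherwise defers it using the
// default ModalityState.
runInEdt {
    modelComboBox.model = DefaultComboBoxModel(availableModels.toTypedArray())
}

// The revert that fixes the symptom: always schedule the update on the
// EDT via the application's invokeLater.
ApplicationManager.getApplication().invokeLater {
    modelComboBox.model = DefaultComboBoxModel(availableModels.toTypedArray())
}
```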

Relevant log output or stack trace

No response

Steps to reproduce

  1. Install CodeGPT
  2. Attempt to configure CodeGPT to use an Ollama backend.
  3. Observe the models box not updating or displaying any models.

CodeGPT version

2.8.2-241.1

Operating System

macOS

dejavuwl commented 6 days ago

Same here on Windows 11. The models refresh keeps showing "Loading", along with "Loading is not available, please select another model".

xiaoshyang commented 6 days ago

I ran into this issue as well. [Translated from Chinese]

carlrobertoh commented 6 days ago

Hi, thanks for reporting! I will take a look asap. The only difference is that runInEdt uses ModalityState, and if the current thread is already the EDT, the runnable is executed immediately rather than being scheduled again. However, I cannot see how that would cause this issue.
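One possible mechanism, sketched below, is the modality that runInEdt attaches to the runnable when it is called from a background thread. This is speculation, not a confirmed diagnosis; `updateComboBox` and `models` are placeholder names:

```kotlin
import com.intellij.openapi.application.ModalityState
import com.intellij.openapi.application.runInEdt
import javax.swing.SwingUtilities

// Plain Swing invokeLater ignores IntelliJ modality entirely: the update
// runs on the next EDT cycle even while the modal Settings dialog is open.
SwingUtilities.invokeLater { updateComboBox(models) }

// runInEdt defaults to ModalityState.defaultModalityState(). Called from a
// background thread, this typically resolves to NON_MODAL, so the runnable
// is postponed until every modal dialog (including Settings) has closed,
// which would leave the combo box stuck on "Loading...".
runInEdt { updateComboBox(models) }

// Passing an explicit modality such as ModalityState.any() lets the update
// run while the dialog is still open.
runInEdt(ModalityState.any()) { updateComboBox(models) }
```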