twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Twinny stops working after machine goes in standby #166

Closed — onel closed this issue 2 months ago

onel commented 4 months ago

Describe the bug
Not sure if this is a twinny or an ollama issue, but the chat feature seems to stop working after the machine goes into standby. Using an M1 MacBook Air, Ventura 13.4.

To Reproduce
Steps to reproduce the behavior:

  1. Start ollama with `ollama run ...`
  2. VS code + twinny running normally
  3. Close the laptop lid
  4. Open back up
  5. ollama is still responsive when used from the terminal
  6. Sending a message through twinny in VS Code shows the loading indicator indefinitely

Expected behavior
Twinny chat should continue working after the machine wakes.

Restarting VS Code and ollama doesn't seem to fix the problem, so I'm wondering if something else is involved. Ollama works fine through the CLI.

rjmacarthy commented 4 months ago

Hmm, I have not experienced this issue.

I don't have a laptop/MacBook, so I'm unable to test what might be happening. Does the Ollama API respond separately?
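One way to check whether the Ollama HTTP API itself is still reachable after wake (separately from twinny) is a quick probe against the server root. This is a sketch, not part of twinny; it assumes Ollama's default address `http://localhost:11434`, where the root endpoint answers with "Ollama is running":

```python
import urllib.request


def ollama_alive(base_url="http://localhost:11434", timeout=5):
    """Return True if the Ollama HTTP API answers at base_url, else False.

    base_url is an assumption (Ollama's default port is 11434); pass your
    own if you changed OLLAMA_HOST.
    """
    try:
        # The Ollama server root responds with HTTP 200 when it is up.
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout: treat as not alive.
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_alive())
```

If this prints `True` after wake while twinny's chat still hangs, the stale connection is likely on the extension side rather than in Ollama.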

rjmacarthy commented 2 months ago

Stale