sourcegraph / cody

AI that knows your entire codebase
https://cody.dev
Apache License 2.0

Auth: add offline mode support for Ollama models #4691

Closed · abeatrix closed this 4 days ago

abeatrix commented 4 days ago

Closes https://linear.app/sourcegraph/issue/CODY-2631/cody-requires-an-online-connection-and-an-open-vs-code-to-work-offline

This change adds an offline mode that lets users keep using Cody with only Ollama models when there is no internet connection.

The key changes are:

The screen below only shows up when there is a networking issue, e.g. when Cody does not have an internet connection:

[screenshot: connection issue page]
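The decision above (only show the offline screen for connectivity failures, not for other auth errors) could be sketched like this. All names here (`isNetworkError`, `shouldOfferOfflineMode`, the error-code set) are illustrative assumptions, not Cody's actual implementation:

```typescript
// Error codes that indicate the machine is offline rather than,
// say, presenting bad credentials. (Assumed set for this sketch.)
const OFFLINE_ERROR_CODES = new Set([
    'ENOTFOUND',
    'ECONNREFUSED',
    'ETIMEDOUT',
    'ENETUNREACH',
])

function isNetworkError(err: { code?: string }): boolean {
    return err.code !== undefined && OFFLINE_ERROR_CODES.has(err.code)
}

function shouldOfferOfflineMode(authError: { code?: string }): boolean {
    // Only surface the "Use Cody Offline with Ollama" screen for
    // connectivity failures, never for credential problems.
    return isNetworkError(authError)
}

console.log(shouldOfferOfflineMode({ code: 'ENOTFOUND' })) // true
console.log(shouldOfferOfflineMode({ code: 'EAUTH' }))     // false
```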

TO-DO

Test plan

  1. Start Cody from this branch
  2. Open Ollama
  3. Open Chat and ask Cody a question to verify Cody still works while there is an internet connection
  4. Turn off your Wi-Fi
  5. Reload VS Code
  6. Verify you are seeing the connection issue page
  7. Click on Use Cody Offline with Ollama
  8. Verify the sidebar is loaded correctly
  9. Open Chat and select an Ollama instruct model
  10. Ask Cody a question and verify you are getting a response back from the Ollama model
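Step 10's round trip to a local Ollama model can be sketched with Ollama's documented `/api/chat` request shape. `buildOllamaChatRequest` is a hypothetical helper and the model name is just an example:

```typescript
// Request payload for Ollama's POST /api/chat endpoint.
interface OllamaChatRequest {
    model: string
    messages: { role: 'system' | 'user' | 'assistant'; content: string }[]
    stream: boolean
}

// Hypothetical helper: wrap a single user question into a
// non-streaming chat request for a local Ollama server.
function buildOllamaChatRequest(model: string, question: string): OllamaChatRequest {
    return {
        model,
        messages: [{ role: 'user', content: question }],
        stream: false,
    }
}

// In offline mode the extension would POST this to the local server,
// e.g. fetch('http://localhost:11434/api/chat', { method: 'POST', body: JSON.stringify(req) })
const req = buildOllamaChatRequest('llama3:instruct', 'What does this function do?')
console.log(req.model) // "llama3:instruct"
```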

Demo

[demo screenshot]
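Step 9 of the test plan assumes an instruct-tuned model is already pulled locally. Ollama's `GET /api/tags` endpoint lists installed models; a sketch of filtering that listing for instruct models (picking by name substring is an assumption for illustration):

```typescript
// Shape of Ollama's GET /api/tags response (fields trimmed to what
// this sketch uses).
interface OllamaTagsResponse {
    models: { name: string }[]
}

// Assumed heuristic: treat any model whose tag mentions "instruct"
// as selectable for chat in offline mode.
function instructModels(tags: OllamaTagsResponse): string[] {
    return tags.models
        .map(m => m.name)
        .filter(name => name.includes('instruct'))
}

const sample: OllamaTagsResponse = {
    models: [{ name: 'llama3:instruct' }, { name: 'codellama:7b' }],
}
console.log(instructModels(sample)) // ["llama3:instruct"]
```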
sqs commented 4 days ago

reviewing