sourcegraph / cody

bug: "Use Cody offline with Ollama" button not working #5572

jp0707 commented 2 months ago

Version

1.34.2

Describe the bug

"Use Cody offline with Ollama" button does not do anything from the sign-in screens.

Steps to reproduce:

  1. Install the Cody VS Code extension afresh
  2. Set up Ollama (guide); the settings it produces are sketched after these steps
  3. Go offline (disconnect from the internet)
  4. Restart VS Code
  5. Open the Cody side panel in VS Code. This should show the "Cody could not start due to a connection issue." screen with the "Use Cody offline with Ollama" button
  6. Click "Use Cody offline with Ollama"
[Screenshot, 2024-09-13: "Cody could not start due to a connection issue." screen with the "Use Cody offline with Ollama" button]
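
For reference, the guide linked in step 2 ends with VS Code settings along these lines (a minimal sketch of Cody's experimental Ollama autocomplete configuration; the model name is only an example, and setting names may differ by extension version):

{
  // Route Cody autocomplete to a local Ollama server instead of Sourcegraph
  // (experimental; sketched from the Ollama setup guide, not from this report)
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",  // Ollama's default local endpoint
    "model": "deepseek-coder:6.7b"    // any model already pulled locally
  }
}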

Expected behavior

Should be taken to the appropriate UI to start using Cody offline with Ollama.

Additional context

Debug Logs

█ AuthProvider:init:lastEndpoint: Token recovered from secretStorage https://sourcegraph.com/
█ CodyLLMConfiguration: {}
█ ModelsService: Setting primary models: []
█ telemetry-v2: recordEvent: cody.auth/failed  {
  "parameters": {
    "version": 0
  },
  "timestamp": "2024-09-13T18:39:45.058Z"
}
█ featureflag: refreshed
█ ContextFiltersProvider: fetchContextFilters  {}
█ ChatsController:constructor: init
█ CodyCompletionProvider:notSignedIn: You are not signed in.
█ CodyCompletionProvider:notSignedIn: You are not signed in.
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ContextFiltersProvider: fetchContextFilters  {}
█ UpstreamHealth: Failed to ping upstream host  {
  "error": {
    "message": "request to https://sourcegraph.com/healthz failed, reason: getaddrinfo ENOTFOUND sourcegraph.com",
    "type": "system",
    "errno": "ENOTFOUND",
    "code": "ENOTFOUND"
  }
}
█ ContextFiltersProvider: fetchContextFilters  {}
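
The ENOTFOUND entries show the extension is still trying to reach sourcegraph.com while offline. As a sanity check that the local side is healthy, Ollama's HTTP API lists installed models at GET http://localhost:11434/api/tags; a response shaped roughly like the following (illustrative values, not taken from this report; fields vary by Ollama version) confirms the server itself is up:

{
  "models": [
    {
      "name": "deepseek-coder:6.7b",
      "modified_at": "2024-09-13T14:00:00Z",
      "size": 3827833517
    }
  ]
}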
jp0707 commented 2 months ago

This is currently blocking me from using Cody with Ollama. Is there any other way to bypass the log-in screen so I can start using Cody with local Ollama?

sascharo commented 2 months ago

> This is currently blocking me from using Cody with Ollama. Is there any other way to bypass the log-in screen so I can start using Cody with local Ollama?

What about switching to an offline model after you've logged in?

jyoti-re-qr commented 2 months ago

> What about switching to an offline model after you've logged in?

But that requires me to log in in the first place. I'd like to avoid having to log in just to use an offline model :)