vmware-archive / octant

Highly extensible platform for developers to better understand the complexity of Kubernetes clusters.
https://octant.dev
Apache License 2.0

Unable to switch to more than 1 context without crashing #3350

Open sarman-tftsr opened this issue 1 year ago

What steps did you take and what happened: [A clear and concise description of what the bug is, and what commands you ran.] I administer multiple Kubernetes clusters. If I switch contexts more than once or twice, Octant either crashes (when launched from the CLI) or hangs (when launched from the GUI). When this happens, my only recourse is to close the application (if launched from the GUI) and relaunch it to reach the context I need.

What did you expect to happen: The application should be able to switch contexts without locking up. Here is the console output when launched via the CLI (partial, due to the character limit):

2022-10-07T08:51:30.909-0500    ERROR   api/content_manager.go:159  generate content    {"client-id": "cdbb5ed8-4646-11ed-a414-f01898e82da4", "err": "generate content: preferred version for StreamTemplate.jetstream.nats.io: unknown version for StreamTemplate.jetstream.nats.io", "content-path": "overview/namespace/apollo-dev"}
github.com/vmware-tanzu/octant/internal/api.(*ContentManager).runUpdate.func1
    github.com/vmware-tanzu/octant/internal/api/content_manager.go:159
github.com/vmware-tanzu/octant/internal/api.(*InterruptiblePoller).Run.func1
    github.com/vmware-tanzu/octant/internal/api/poller.go:86
github.com/vmware-tanzu/octant/internal/api.(*InterruptiblePoller).Run
    github.com/vmware-tanzu/octant/internal/api/poller.go:95
github.com/vmware-tanzu/octant/internal/api.(*ContentManager).Start
    github.com/vmware-tanzu/octant/internal/api/content_manager.go:133
E1007 08:51:31.155137   25950 runtime.go:78] Observed a panic: "close of closed channel" (close of closed channel)
goroutine 73582 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x58a9c40, 0xb692470})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:74 +0x85
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x404a3d1})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:48 +0x75
panic({0x58a9c40, 0xb692470})
    runtime/panic.go:1038 +0x215
k8s.io/client-go/tools/cache.(*processorListener).pop(0xc002381500)
    k8s.io/client-go@v0.21.3/tools/cache/shared_informer.go:752 +0x287
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:71 +0x88
E1007 08:51:31.155132   25950 runtime.go:78] Observed a panic: "close of closed channel" (close of closed channel)
goroutine 73556 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x58a9c40, 0xb692470})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:74 +0x85
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x404a3d1})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:48 +0x75
panic({0x58a9c40, 0xb692470})
    runtime/panic.go:1038 +0x215
k8s.io/client-go/tools/cache.(*processorListener).pop(0xc002381500)
    k8s.io/client-go@v0.21.3/tools/cache/shared_informer.go:752 +0x287
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:71 +0x88
panic: close of closed channel [recovered]
    panic: close of closed channel

goroutine 73582 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x404a3d1})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:55 +0xd8
panic({0x58a9c40, 0xb692470})
    runtime/panic.go:1038 +0x215
k8s.io/client-go/tools/cache.(*processorListener).pop(0xc002381500)
    k8s.io/client-go@v0.21.3/tools/cache/shared_informer.go:752 +0x287
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:71 +0x88
panic: close of closed channel [recovered]
    panic: close of closed channel

goroutine 73556 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x404a3d1})
    k8s.io/apimachinery@v0.21.3/pkg/util/runtime/runtime.go:55 +0xd8
panic({0x58a9c40, 0xb692470})
    runtime/panic.go:1038 +0x215
k8s.io/client-go/tools/cache.(*processorListener).pop(0xc002381500)
    k8s.io/client-go@v0.21.3/tools/cache/shared_informer.go:752 +0x287
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    k8s.io/apimachinery@v0.21.3/pkg/util/wait/wait.go:71 +0x88

Anything else you would like to add: [Miscellaneous information that will assist in solving the issue.] See the console output posted above.

Environment: CLI version of Octant:

octant version
Version:  0.25.1
Git commit:  f16cbb951905f1f8549469dfc116ca16cf679d46
Built:  2022-02-24T21:59:56Z

GUI application of Octant:

Version | (dev-version)
-- | --
Git commit | f16cbb9
Built | 2022-02-24T22:39:43Z

KUBECTL Version:

Client Version: version.Info{Major:"1", Minor:"24", GitVersion:"v1.24.3", GitCommit:"aef86a93758dc3cb2c658dd9657ab4ad4afc21cb", GitTreeState:"clean", BuildDate:"2022-07-13T14:30:46Z", GoVersion:"go1.18.3", Compiler:"gc", Platform:"darwin/amd64"}
Kustomize Version: v4.5.4
Server Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.6+k3s1", GitCommit:"418c3fa858b69b12b9cefbcff0526f666a6236b9", GitTreeState:"clean", BuildDate:"2022-04-28T22:16:18Z", GoVersion:"go1.17.5", Compiler:"gc", Platform:"linux/amd64"}

OS:

macOS 12.6