open-telemetry/opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io
Apache License 2.0

Flaky test on k8sclusterreceiver #5973

Status: Open
jpkrohling opened 2 years ago

jpkrohling commented 2 years ago

Seen on https://github.com/open-telemetry/opentelemetry-collector-contrib/runs/4031747516?check_suite_focus=true

make[2]: Entering directory '/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/k8sclusterreceiver'
go test -race -timeout 300s --tags="" ./...
E1028 07:57:41.267867   42008 runtime.go:78] Observed a panic: "invalid memory address or nil pointer dereference" (runtime error: invalid memory address or nil pointer dereference)
goroutine 55 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x22ad360, 0x3694e00})
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/runtime/runtime.go:74 +0xe6
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc000021738})
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/runtime/runtime.go:48 +0xb0
panic({0x22ad360, 0x3694e00})
    /opt/hostedtoolcache/go/1.17.2/x64/src/runtime/panic.go:1047 +0x266
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc000198380, 0xc00002c360, 0xc00021c120, 0xc000021988)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:319 +0xf54
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc000198380, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:360 +0x316
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0001b9638)
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:155 +0x82
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0001b6140, {0x27dad40, 0xc00012e410}, 0x1, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc000198380, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:220 +0x2df
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:73 +0x7f
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:71 +0xdf
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
    panic: runtime error: invalid memory address or nil pointer dereference

goroutine 55 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc000021738})
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/runtime/runtime.go:55 +0x145
panic({0x22ad360, 0x3694e00})
    /opt/hostedtoolcache/go/1.17.2/x64/src/runtime/panic.go:1047 +0x266
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc000198380, 0xc00002c360, 0xc00021c120, 0xc000021988)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:319 +0xf54
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc000198380, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:360 +0x316
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0001b9638)
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:155 +0x82
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0001b6140, {0x27dad40, 0xc00012e410}, 0x1, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc000198380, 0xc00021c120)
    /home/runner/go/pkg/mod/k8s.io/client-go@v0.22.2/tools/cache/reflector.go:220 +0x2df
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:73 +0x7f
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    /home/runner/go/pkg/mod/k8s.io/apimachinery@v0.22.2/pkg/util/wait/wait.go:71 +0xdf
FAIL    github.com/open-telemetry/opentelemetry-collector-contrib/receiver/k8sclusterreceiver   0.131s
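For context (not part of the original report): the panic originates inside client-go's Reflector.ListAndWatch, which in informer-based unit tests is often a symptom of the informer machinery racing test setup or teardown rather than a bug in the receiver's own collection logic. Below is a minimal, self-contained sketch of the conventional safe pattern against a fake clientset: start the informer factory, wait for the cache to sync, make assertions, and only then close the stop channel. The package layout, object names, and the seeded pod are illustrative and not taken from the k8sclusterreceiver tests.

package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes/fake"
	"k8s.io/client-go/tools/cache"
)

func main() {
	// Fake clientset seeded with one pod; no real API server is involved.
	client := fake.NewSimpleClientset(&corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{Name: "test-pod", Namespace: "default"},
	})

	// Shared informer factory with no resync period, as is typical in tests.
	factory := informers.NewSharedInformerFactory(client, 0)
	podInformer := factory.Core().V1().Pods().Informer()

	stopCh := make(chan struct{})
	factory.Start(stopCh)

	// Block until the reflector has completed its initial list; tearing the
	// test down before this point is a common source of flaky races.
	if !cache.WaitForCacheSync(stopCh, podInformer.HasSynced) {
		close(stopCh)
		fmt.Println("cache did not sync")
		return
	}

	fmt.Printf("pods in cache: %d\n", len(podInformer.GetStore().List()))

	// Signal the informer goroutines to stop only after all assertions.
	close(stopCh)
	time.Sleep(50 * time.Millisecond) // give the reflector time to observe the stop signal
}

If a test closes its stop channel (or cancels its context) while the reflector is still performing its initial list, intermittent panics of the kind shown above are one plausible outcome, though the stack trace alone does not pin down the exact nil value.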
skyduo commented 2 years ago

Same error seen in the health check PR: https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/5643

github-actions[bot] commented 1 year ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions[bot] commented 1 year ago

Pinging code owners: @dmitryax. See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 1 year ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

atoulme commented 8 months ago

Is this issue really worth keeping marked as never stale? Can we close it, since no further reports have been made in over two years?