kubernetes / client-go

Go client for Kubernetes.
Apache License 2.0

invalid memory address or nil pointer dereference on go 1.20.1 #1229

Closed hakankoklu closed 5 months ago

hakankoklu commented 1 year ago

After upgrading from Go 1.19.6 to Go 1.20.1, I am getting this error during testing:

E0217 10:11:44.300188   21635 runtime.go:78] Observed a panic: "invalid memory address or nil pointer dereference" (runtime error: invalid memory address or nil pointer dereference)
goroutine 158 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x2c0ed00?, 0x4b09660?})
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/runtime/runtime.go:74 +0xf0
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc0001df8e8?})
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/runtime/runtime.go:48 +0xb0
panic({0x2c0ed00, 0x4b09660})
    /opt/go/src/runtime/panic.go:890 +0x263
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc0001d27e0, 0xc0001d6630, 0xc000d7c5a0, 0xc0001dfb40)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:319 +0x127a
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc0001d27e0, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:361 +0x2f9
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc001001ee0)
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:155 +0x49
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0001c21c0?, {0x3905c00, 0xc0006d4780}, 0x1, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc0001d27e0, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:220 +0x2aa
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:73 +0x74
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:71 +0xe5
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
    panic: runtime error: invalid memory address or nil pointer dereference

goroutine 158 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0xc0001df8e8?})
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/runtime/runtime.go:55 +0x145
panic({0x2c0ed00, 0x4b09660})
    /opt/go/src/runtime/panic.go:890 +0x263
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1(0xc0001d27e0, 0xc0001d6630, 0xc000d7c5a0, 0xc0001dfb40)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:319 +0x127a
k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch(0xc0001d27e0, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:361 +0x2f9
k8s.io/client-go/tools/cache.(*Reflector).Run.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:221 +0x45
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc001001ee0)
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:155 +0x49
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0001c21c0?, {0x3905c00, 0xc0006d4780}, 0x1, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:156 +0xcf
k8s.io/client-go/tools/cache.(*Reflector).Run(0xc0001d27e0, 0xc000d7c5a0)
    /code/.cache/gopath/pkg/mod/k8s.io/client-go@v0.23.0/tools/cache/reflector.go:220 +0x2aa
k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:56 +0x3f
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:73 +0x74
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
    /code/.cache/gopath/pkg/mod/k8s.io/apimachinery@v0.23.0/pkg/util/wait/wait.go:71 +0xe5

I am using client-go and apimachinery v0.23.0. Go 1.19.6 works without a problem. The panic trace doesn't include anything from the code under test, so I am not sure how to troubleshoot this.

k8s-triage-robot commented 1 year ago

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Mark this issue as fresh with `/remove-lifecycle stale`
- Close this issue with `/close`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

k8s-triage-robot commented 1 year ago

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Mark this issue as fresh with `/remove-lifecycle rotten`
- Close this issue with `/close`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle rotten

k8s-triage-robot commented 5 months ago

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.

This bot triages issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Reopen this issue with `/reopen`
- Mark this issue as fresh with `/remove-lifecycle rotten`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/close not-planned

k8s-ci-robot commented 5 months ago

@k8s-triage-robot: Closing this issue, marking it as "Not Planned".

In response to [this](https://github.com/kubernetes/client-go/issues/1229#issuecomment-1899816166):

> The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
>
> This bot triages issues according to the following rules:
>
> - After 90d of inactivity, `lifecycle/stale` is applied
> - After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
> - After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed
>
> You can:
>
> - Reopen this issue with `/reopen`
> - Mark this issue as fresh with `/remove-lifecycle rotten`
> - Offer to help out with [Issue Triage][1]
>
> Please send feedback to sig-contributor-experience at [kubernetes/community](https://github.com/kubernetes/community).
>
> /close not-planned
>
> [1]: https://www.kubernetes.dev/docs/guide/issue-triage/

Instructions for interacting with me using PR comments are available [here](https://git.k8s.io/community/contributors/guide/pull-requests.md). If you have questions or suggestions related to my behavior, please file an issue against the [kubernetes/test-infra](https://github.com/kubernetes/test-infra/issues/new?title=Prow%20issue:) repository.