agardnerIT closed this issue 1 year ago
Updating the Helm chart to v0.2.35 causes a panic in open-feature-operator-controller-manager:
```
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x1302f10]
goroutine 1 [running]:
k8s.io/client-go/discovery.convertAPIResource(...)
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/aggregated_discovery.go:114
k8s.io/client-go/discovery.convertAPIGroup({{{0x0, 0x0}, {0x0, 0x0}}, {{0xc0000e3d28, 0x15}, {0x0, 0x0}, {0x0, 0x0}, ...}, ...})
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/aggregated_discovery.go:95 +0x6f0
k8s.io/client-go/discovery.SplitGroupsAndResources({{{0xc0000e20d8, 0x15}, {0xc0000f0fa0, 0x1b}}, {{0x0, 0x0}, {0x0, 0x0}, {0x0, 0x0}, ...}, ...})
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/aggregated_discovery.go:49 +0x125
k8s.io/client-go/discovery.(*DiscoveryClient).downloadAPIs(0xc000398878?)
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:328 +0x3de
k8s.io/client-go/discovery.(*DiscoveryClient).GroupsAndMaybeResources(0xc000398ca8?)
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:203 +0x65
k8s.io/client-go/discovery.ServerGroupsAndResources({0x1aefcf8, 0xc000279d70})
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:413 +0x59
k8s.io/client-go/discovery.(*DiscoveryClient).ServerGroupsAndResources.func1()
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:376 +0x25
k8s.io/client-go/discovery.withRetries(0x2, 0xc000440cc0)
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:651 +0x71
k8s.io/client-go/discovery.(*DiscoveryClient).ServerGroupsAndResources(0x0?)
	/go/pkg/mod/k8s.io/client-go@v0.26.3/discovery/discovery_client.go:375 +0x3a
k8s.io/client-go/restmapper.GetAPIGroupResources({0x1aefcf8?, 0xc000279d70?})
	/go/pkg/mod/k8s.io/client-go@v0.26.3/restmapper/discovery.go:148 +0x42
sigs.k8s.io/controller-runtime/pkg/client/apiutil.NewDynamicRESTMapper.func1()
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/client/apiutil/dynamicrestmapper.go:94 +0x25
sigs.k8s.io/controller-runtime/pkg/client/apiutil.(*dynamicRESTMapper).setStaticMapper(...)
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/client/apiutil/dynamicrestmapper.go:130
sigs.k8s.io/controller-runtime/pkg/client/apiutil.NewDynamicRESTMapper(0xc0000f8120?, {0x0, 0x0, 0x47f4d7c94faa9501?})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/client/apiutil/dynamicrestmapper.go:110 +0x182
sigs.k8s.io/controller-runtime/pkg/cluster.setOptionsDefaults.func1(0xc0004ff260?)
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/cluster/cluster.go:217 +0x25
sigs.k8s.io/controller-runtime/pkg/cluster.New(0xc0002ea000, {0xc000399660, 0x1, 0x0?})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/cluster/cluster.go:159 +0x18d
sigs.k8s.io/controller-runtime/pkg/manager.New(_, {0xc0004ff260, 0x0, 0x0, {{0x1aec038, 0xc00041f540}, 0x0}, 0x1, {0x0, 0x0}, ...})
	/go/pkg/mod/sigs.k8s.io/controller-runtime@v0.14.6/pkg/manager/manager.go:351 +0xf9
main.main()
	/workspace/main.go:173 +0x1275
```
Furthermore, this causes the demo app to crash. It seems that, because it talks to flagd over gRPC, the demo app has a hard dependency on flagd:
Demo app logs:
```
Connecting to flagD at localhost:8030
Example app listening on port 3000
/usr/src/app/node_modules/@openfeature/flagd-provider/index.cjs:1533
            reject(new Error(errorMessage));
                   ^
Error: FlagdProvider: max stream connect attempts (5 reached)
    at GRPCService.handleError (/usr/src/app/node_modules/@openfeature/flagd-provider/index.cjs:1533:20)
    at /usr/src/app/node_modules/@openfeature/flagd-provider/index.cjs:1476:22
    at /usr/src/app/node_modules/@protobuf-ts/runtime-rpc/build/commonjs/rpc-output-stream.js:86:36
    at Array.forEach (<anonymous>)
    at RpcOutputStreamController.notifyError (/usr/src/app/node_modules/@protobuf-ts/runtime-rpc/build/commonjs/rpc-output-stream.js:86:23)
    at ClientReadableStreamImpl.<anonymous> (/usr/src/app/node_modules/@protobuf-ts/grpc-transport/build/commonjs/grpc-transport.js:90:27)
    at ClientReadableStreamImpl.emit (node:events:513:28)
    at Object.onReceiveStatus (/usr/src/app/node_modules/@grpc/grpc-js/build/src/client.js:351:28)
    at Object.onReceiveStatus (/usr/src/app/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:323:181)
    at /usr/src/app/node_modules/@grpc/grpc-js/build/src/resolving-call.js:94:78
```
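Until the provider tolerates a missing backend, the app can shield itself by falling back to hard-coded defaults instead of letting the rejected connect promise kill the process. A minimal sketch of that pattern follows; the `connect`/`resolveBoolean` names are illustrative stand-ins, not the real `@openfeature/flagd-provider` API:

```javascript
// Sketch of graceful degradation: serve a hard-coded default instead of
// crashing when the flagd gRPC stream cannot be established.
// NOTE: connect() and resolveBoolean() are hypothetical stand-ins.
async function getBooleanFlag(connect, flagKey, defaultValue) {
  try {
    const client = await connect(); // may reject after max connect attempts
    return await client.resolveBoolean(flagKey);
  } catch (err) {
    console.error(`flagd unreachable, serving default for "${flagKey}":`, err.message);
    return defaultValue;
  }
}

// Simulate the failure mode from the logs above: every connect attempt fails.
const failingConnect = () =>
  Promise.reject(new Error('FlagdProvider: max stream connect attempts (5 reached)'));

getBooleanFlag(failingConnect, 'new-welcome-message', false)
  .then((value) => console.log('flag value:', value)); // logs "flag value: false"
```

The key point is that the flag lookup degrades to its default rather than propagating an unhandled rejection, so the Express server stays up even when flagd is down.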
Raised an issue in the OFO repo: https://github.com/open-feature/open-feature-operator/issues/515
Could you please test it with Kind and Kubernetes v1.26? Just to double-check before releasing a new flagd and OFO version.
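For that check, a Kind cluster can be pinned to a 1.26 node image with a small config file. A sketch, assuming the `kindest/node:v1.26.6` tag (pick the v1.26.x image that matches your installed kind release):

```yaml
# kind-1.26.yaml -- Kind cluster pinned to a Kubernetes 1.26 node image.
# The exact image tag is an assumption; match it to your kind version.
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
  - role: control-plane
    image: kindest/node:v1.26.6
```

Then `kind create cluster --config kind-1.26.yaml`, install the chart as usual, and re-run the scenario.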
I've tested this and bumping OFO to v0.2.36 solves this issue. Thanks!