fabric8io / kubernetes-client

Java client for Kubernetes & OpenShift
http://fabric8.io
Apache License 2.0

NoSuchMethodError on ConfigBuilder methods for primitives #6249

Closed: bearpaws closed this issue 1 month ago

bearpaws commented 3 months ago

Describe the bug

After upgrading from 6.13.1 to 6.13.2, this produces an exception:

int max = 512;
new ConfigBuilder(Config.autoConfigure(null))
            .withMaxConcurrentRequests(max)
            .build()
java.lang.NoSuchMethodError: 'io.fabric8.kubernetes.client.ConfigFluent io.fabric8.kubernetes.client.ConfigBuilder.withMaxConcurrentRequests(int)'

In this commit, all Config methods were changed to accept boxed objects instead of primitives: https://github.com/fabric8io/kubernetes-client/commit/08b0e9f301fb163d9b98f71cd74fdf132ae7625b

It seems sundrio does not support both (possibly related to https://github.com/sundrio/sundrio/issues/387).
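
For context, the change described in the linked commit amounts to swapping the primitive parameter for its wrapper type. The sketch below paraphrases that change with made-up class names; it is not code copied from the commit:

// Minimal sketch of the API change described above (names paraphrased; not the actual fabric8 classes).
// Before (6.13.1-style): the setter takes the primitive.
class FluentBefore {
    private int maxConcurrentRequests;
    FluentBefore withMaxConcurrentRequests(int value) { this.maxConcurrentRequests = value; return this; }
}
// After (6.13.2-style): the setter takes the wrapper, so "not set" (null) can be told apart from an explicit 0.
class FluentAfter {
    private Integer maxConcurrentRequests;
    FluentAfter withMaxConcurrentRequests(Integer value) { this.maxConcurrentRequests = value; return this; }
}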

Fabric8 Kubernetes Client version

6.13.2

Steps to reproduce

See code above.

Expected behavior

Primitives should still be supported.

Runtime

Kubernetes (vanilla)

Kubernetes API Server version

1.25.3@latest

Environment

Linux

Fabric8 Kubernetes Client Logs

No response

Additional context

No response

rohanKanojia commented 3 months ago

@bearpaws : Yes, this change was deliberately done to differentiate between user-configured values coming from the ConfigBuilder and default values initialized in the initial new ConfigBuilder call.

Is it not possible in your codebase to make these declarations boxed?
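
For reference, the boxed variant of the original snippet would look like the sketch below; it only helps if the calling code can be changed and recompiled against 6.13.2:

// same imports as the snippet in the issue description
Integer max = 512;
Config config = new ConfigBuilder(Config.autoConfigure(null))
        .withMaxConcurrentRequests(max)
        .build();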

bearpaws commented 3 months ago

@rohanKanojia This is a third-party library, so in this case it's not possible for me to change the code.

It is concerning, though, that this is not a compile-time error but a runtime error. Why doesn't auto-boxing work in this case?

Was this an expected break in a patch release?

rohanKanojia commented 3 months ago

@bearpaws : I tried running the above code in one of my projects using v6.13.2 but didn't get any error. Could you please share more detailed steps to reproduce it, or maybe a reproducer project?

Was this an expected break in a patch release?

We thought this change would be transparent to users due to auto-boxing.
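
Auto-boxing makes the change source-compatible but not binary-compatible, which would explain a runtime-only failure. A minimal sketch of the distinction, assuming the (int) -> (Integer) signature change described above:

import io.fabric8.kubernetes.client.Config;
import io.fabric8.kubernetes.client.ConfigBuilder;

// Compiled against 6.13.1, the call below is baked into the class file with the exact
// descriptor of the (int) overload, roughly withMaxConcurrentRequests(I)L.../ConfigFluent;.
// Auto-boxing is applied by javac at compile time, so an already-built JAR keeps asking
// the JVM for the (int) method, which no longer exists in 6.13.2: NoSuchMethodError.
// Recompiling the same source against 6.13.2 resolves the (Integer) overload instead and
// inserts Integer.valueOf(max) automatically, which is why freshly built projects work.
int max = 512;
Config config = new ConfigBuilder(Config.autoConfigure(null))
        .withMaxConcurrentRequests(max)
        .build();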

manusa commented 3 months ago

It looks more like you might have conflicting versions of the client. Can you provide more details about the library you're using ("This is a third-party library")?

Can you try using dependencyManagement in your project to ensure everything uses 6.13.2 by importing the BOM:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.fabric8</groupId>
      <artifactId>kubernetes-client-bom</artifactId>
      <version>6.13.2</version>
      <scope>import</scope>
      <type>pom</type>
    </dependency>
  </dependencies>
</dependencyManagement>
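
If it helps to rule out mixed client versions on the classpath, the Maven dependency plugin can also filter the tree down to the fabric8 artifacts (standard Maven tooling, nothing fabric8-specific):

mvn dependency:tree -Dincludes=io.fabric8
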
metacosm commented 3 months ago

I'm seeing similar issues and I do use the BOM so I don't think this is a dependency problem.

manusa commented 3 months ago

I'm seeing similar issues and I do use the BOM so I don't think this is a dependency problem.

If the library @bearpaws is using is Quarkus, then it might be the same case as yours. If his library is another one, then it's worth a shot.

metacosm commented 3 months ago

For more context, I see the exact same issue when trying to build the Quarkus extension for the Java Operator SDK with 6.13.2, while the Java Operator SDK itself works just fine with 6.13.2. Are you using the same setup, @bearpaws? (The example you gave seems to indicate so.)

manusa commented 3 months ago

I released v6.13.3.

Hopefully this covers the issue regardless of the project setup and any convergence problems with 6.13.x dependencies.

bearpaws commented 3 months ago

@manusa Thanks, the fix seems to be working now.

@metacosm I'm using the Java Operator SDK, no Quarkus.

metacosm commented 3 months ago

@bearpaws would you mind detailing your setup a little, since it seems the issue was only manifesting itself when using JOSDK with Quarkus?

bearpaws commented 3 months ago

@metacosm Here is the full stack trace:

 java.lang.NoSuchMethodError: 'io.fabric8.kubernetes.client.ConfigFluent io.fabric8.kubernetes.client.ConfigBuilder.withMaxConcurrentRequests(int)'
    at io.javaoperatorsdk.operator.api.config.ConfigurationService.getKubernetesClient(ConfigurationService.java:94)
    at io.javaoperatorsdk.operator.api.config.AbstractConfigurationService.getKubernetesClient(AbstractConfigurationService.java:159)
    at io.javaoperatorsdk.operator.api.config.ConfigurationService$1.clone(ConfigurationService.java:60)
    at io.javaoperatorsdk.operator.processing.event.source.informer.InformerManager.lambda$get$11(InformerManager.java:173)
    at java.base/java.util.Optional.map(Optional.java:260)
    at io.javaoperatorsdk.operator.processing.event.source.informer.InformerManager.get(InformerManager.java:173)
    at io.javaoperatorsdk.operator.processing.event.source.informer.ManagedInformerEventSource.get(ManagedInformerEventSource.java:121)
    at io.javaoperatorsdk.operator.processing.event.EventProcessor.submitReconciliationExecution(EventProcessor.java:127)
    at io.javaoperatorsdk.operator.processing.event.EventProcessor.handleMarkedEventForResource(EventProcessor.java:119)
    at io.javaoperatorsdk.operator.processing.event.EventProcessor.handleAlreadyMarkedEvents(EventProcessor.java:415)
    at io.javaoperatorsdk.operator.processing.event.EventProcessor.start(EventProcessor.java:409)
    at io.javaoperatorsdk.operator.processing.Controller.start(Controller.java:346)
    at io.javaoperatorsdk.operator.ControllerManager.lambda$start$0(ControllerManager.java:45)
    at io.javaoperatorsdk.operator.api.config.ExecutorServiceManager.lambda$executeAndWaitForAllToComplete$0(ExecutorServiceManager.java:61)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:840)

The operator is started via:

ConfigurationService configSvc = new DefaultConfigurationService();
Operator operator = new Operator(overrider -> overrider.withKubernetesClient(kubeClient));
MyReconciler reconciler = new MyReconciler(kubeClient, config);
ControllerConfiguration<My> controllerConfig = controllerConfig(configSvc, config.namespaces(), reconciler);

operator.register(reconciler, controllerConfig);
operator.start();

JOSDK 4.9.1

metacosm commented 3 months ago

Can you provide the output of mvn dependency:tree (or equivalent), please? Trying to rule out some mismatched client versions being present concurrently.

bearpaws commented 3 months ago
[INFO] +- io.javaoperatorsdk:operator-framework:jar:4.9.1:compile
[INFO] |  +- io.javaoperatorsdk:operator-framework-core:jar:4.9.1:compile
[INFO] |  +- io.fabric8:kubernetes-httpclient-okhttp:jar:6.13.2:compile
[INFO] |  |  +- com.squareup.okhttp3:okhttp:jar:4.12.0:compile
[INFO] |  |  |  +- com.squareup.okio:okio:jar:3.6.0:compile
[INFO] |  |  |  |  \- com.squareup.okio:okio-jvm:jar:3.6.0:compile
[INFO] |  |  |  |     \- org.jetbrains.kotlin:kotlin-stdlib-common:jar:1.9.10:compile
[INFO] |  |  |  \- org.jetbrains.kotlin:kotlin-stdlib-jdk8:jar:1.8.21:compile
[INFO] |  |  |     +- org.jetbrains.kotlin:kotlin-stdlib:jar:1.8.21:compile
[INFO] |  |  |     |  \- org.jetbrains:annotations:jar:13.0:compile
[INFO] |  |  |     \- org.jetbrains.kotlin:kotlin-stdlib-jdk7:jar:1.8.21:compile
[INFO] |  |  \- com.squareup.okhttp3:logging-interceptor:jar:3.12.12:compile
[INFO] |  \- com.squareup:javapoet:jar:1.13.0:compile
[INFO] +- io.fabric8:kubernetes-client:jar:6.13.2:compile
[INFO] |  +- io.fabric8:kubernetes-client-api:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-core:jar:6.13.2:compile
[INFO] |  |  |  \- io.fabric8:kubernetes-model-common:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-gatewayapi:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-resource:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-rbac:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-admissionregistration:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-apps:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-autoscaling:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-apiextensions:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-batch:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-certificates:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-coordination:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-discovery:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-events:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-extensions:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-flowcontrol:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-networking:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-metrics:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-policy:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-scheduling:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-storageclass:jar:6.13.2:compile
[INFO] |  |  +- io.fabric8:kubernetes-model-node:jar:6.13.2:compile
[INFO] |  |  +- org.snakeyaml:snakeyaml-engine:jar:2.7:compile
[INFO] |  |  +- com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:jar:2.17.1:compile
[INFO] |  |  |  \- org.yaml:snakeyaml:jar:2.2:compile
[INFO] |  |  \- com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar:2.17.1:compile
[INFO] |  \- io.fabric8:zjsonpatch:jar:0.3.0:compile
[INFO] +- io.fabric8:kubernetes-server-mock:jar:6.13.2:test
[INFO] |  +- io.fabric8:mockwebserver:jar:6.13.2:test
[INFO] |  |  \- com.squareup.okhttp3:mockwebserver:jar:4.12.0:test
[INFO] |  \- io.fabric8:servicecatalog-client:jar:6.13.2:test
[INFO] |     \- io.fabric8:servicecatalog-model:jar:6.13.2:test

We use the BOM too.

      <dependency>
        <groupId>io.fabric8</groupId>
        <artifactId>kubernetes-client-bom</artifactId>
        <version>6.13.2</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
metacosm commented 3 months ago

Hmm, OK, nothing seems out of the ordinary there… this issue is really weird. 😮

mjurc commented 3 months ago

I'm seeing similar issues between 6.13.1 and 6.13.3 managed by Quarkus, e.g.

java.lang.NoSuchMethodError: 'io.fabric8.kubernetes.client.ConfigFluent io.fabric8.openshift.client.OpenShiftConfigBuilder.withNamespace(java.lang.String)'
    at io.quarkus.test.bootstrap.inject.OpenShiftClient.<init>(OpenShiftClient.java:119)
    at io.quarkus.test.bootstrap.inject.OpenShiftClient.create(OpenShiftClient.java:137)
    at io.quarkus.test.bootstrap.OpenShiftExtensionBootstrap.beforeAll(OpenShiftExtensionBootstrap.java:45)
    at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.lambda$beforeAll$0(QuarkusScenarioBootstrap.java:61)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
    at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.beforeAll(QuarkusScenarioBootstrap.java:61)
    at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.beforeAll(QuarkusScenarioBootstrap.java:50)
    at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
    Suppressed: java.lang.NullPointerException: Cannot invoke "io.quarkus.test.bootstrap.inject.OpenShiftClient.deleteProject()" because "this.client" is null
        at io.quarkus.test.bootstrap.OpenShiftExtensionBootstrap.afterAll(OpenShiftExtensionBootstrap.java:52)
        at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.lambda$afterAll$2(QuarkusScenarioBootstrap.java:88)
        at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
        at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.afterAll(QuarkusScenarioBootstrap.java:88)
        at io.quarkus.test.bootstrap.QuarkusScenarioBootstrap.afterAll(QuarkusScenarioBootstrap.java:78)
        ... 1 more

I've opened an issue for this in the Quarkus repo too: see https://github.com/quarkusio/quarkus/issues/42656

rohanKanojia commented 3 months ago

@mjurc : Could you please check with dependency:tree whether the issue still persists with kubernetes-client v6.13.3?

Your error looks slightly different from the one reported initially. This issue was reported for ConfigBuilder no longer having methods for primitive types. In your stacktrace, I see NoSuchMethodError for withNamespace(String). We haven't changed this method in v6.13.1.

mjurc commented 3 months ago

@rohanKanojia I think the issue doesn't show up when the library is compiled with 6.13.3 and run with 6.13.3, but there's a binary incompatibility between 6.13.1 and 6.13.3 resulting from the fact that ConfigFluent moved:

6.13.1: io.fabric8.kubernetes.client.ConfigFluent
6.13.3: io.fabric8.kubernetes.client.SundrioConfigFluent

If I compile and run with Quarkus 3.14.0.CR1, the issue disappears, but this is the message I get when I use our lib compiled with Quarkus 3.13.2 (fabric8 client 6.13.1) and run it with Quarkus 3.14.0.CR1 (fabric8 client 6.13.3).
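
For what it's worth, the return type is part of the method descriptor a compiled class file asks the JVM to resolve, which is why this only surfaces when the build and runtime versions differ. A rough illustration (the namespace value is a placeholder; the descriptor is paraphrased from the stack trace above):

import io.fabric8.openshift.client.OpenShiftConfig;
import io.fabric8.openshift.client.OpenShiftConfigBuilder;

// Compiled against 6.13.1, this call is recorded with the old return type in its descriptor,
// roughly withNamespace(Ljava/lang/String;)Lio/fabric8/kubernetes/client/ConfigFluent;.
// If 6.13.3 declares the method in a renamed fluent hierarchy (SundrioConfigFluent) with a
// different return type, that exact descriptor no longer exists and the JVM throws
// NoSuchMethodError, even though recompiling the same source against 6.13.3 works fine.
OpenShiftConfig config = new OpenShiftConfigBuilder()
        .withNamespace("example-namespace")   // placeholder value
        .build();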

manusa commented 3 months ago

https://github.com/quarkusio/quarkus/issues/42656#issuecomment-2299277899

We might consider adding an intermediate ConfigFluent class to fix the binary compatibility issue.

@mjurc do you have an easy way to reproduce the issue (maybe a simple reproducer project that fails with the mentioned stacktrace) that can be used to verify if adding the intermediate class solves the problem?
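
To make the "intermediate class" idea more concrete, one possible shape is sketched below; this is purely illustrative of the proposal, not the actual fabric8 change (the real generated fluents are much larger, and each class would live in its own file):

// Hypothetical sketch: reintroduce the old type name between the generated
// SundrioConfigFluent and the builders, so class files that still reference
// io.fabric8.kubernetes.client.ConfigFluent can resolve it again. Methods whose old
// descriptors return ConfigFluent would likely also need to be redeclared here.
class ConfigFluent<A extends ConfigFluent<A>> extends SundrioConfigFluent<A> {
}

class ConfigBuilder extends ConfigFluent<ConfigBuilder> {
    // builder methods unchanged
}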

mjurc commented 2 months ago

@manusa I've created a very simple reproducer:

git clone git@github.com:mjurc/playground.git && cd playground/fabric8-binary-compatibility
# backwards binary compatibility
mvn clean install -Dfabric8.openshift-client.build-version=6.13.1 -Dfabric8.openshift-client.runtime-version=6.13.3
# forwards binary compatibility
mvn clean install -Dfabric8.openshift-client.build-version=6.13.3 -Dfabric8.openshift-client.runtime-version=6.13.1
# pass
mvn clean install -Dfabric8.openshift-client.build-version=6.13.3 -Dfabric8.openshift-client.runtime-version=6.13.3
manusa commented 2 months ago

@manusa I've created a very simple reproducer:

Hi @mjurc

Awesome, thanks :heart:!

After discussion in Quarkus Zulip (https://quarkusio.zulipchat.com/#narrow/stream/187038-dev/topic/quickstarts/near/465419435), my understanding is that a "fix" is no longer needed.

As far as I see it right now, creating a 6.13.4 will either reintroduce a binary-compatibility issue with 6.13.3 or create excessive boilerplate to avoid it. If you still need a fix, please say so; otherwise I don't think it's worth the effort at the moment.

mjurc commented 2 months ago

Hi, I think the issue has been clarified from the Quarkus side and I agree that right now, this particular binary incompatibility doesn't warrant a fix.