BogdanAlexa1234 opened 3 weeks ago
/cc @pedroigor (oidc), @sberyozkin (oidc)
I also tried with the 999-SNAPSHOT version; the last commit in the reproducer repo uses that version and it fails with the same error.
@BogdanAlexa1234 I'll try to have a look tomorrow
@mkouba @manovotn Hi,
This code, https://github.com/quarkusio/quarkus/blob/main/extensions/oidc/deployment/src/main/java/io/quarkus/oidc/deployment/devservices/keycloak/KeycloakDevUIProcessor.java#L42, is indeed repeatedly called if the quarkus-oidc and quarkus-opentelemetry extensions are used and quarkus.otel.logs.enabled=true is configured in application.properties...
KeycloakDevUIProcessor produces SyntheticBeanBuildItem here and CardPageBuildItem here.
Can you please have a quick look: is there something that we can do at the KeycloakDevUIProcessor level to avoid the recursion, or could it be a higher-level issue?
Also CC @brunobat. Hi Bruno, can you please link to the deployment code which deals with quarkus.otel.logs.enabled=true? For some reason it triggers this recursion.
Can you please have a quick look: is there something that we can do at the KeycloakDevUIProcessor level to avoid the recursion, or could it be a higher-level issue?
Well, from the ChainBuildException message it's obvious that LogHandlerProcessor#build() (which is only enabled if quarkus.otel.logs.enabled=true) consumes the BeanContainerBuildItem and thus requires the CDI container to be ready, which is in conflict with KeycloakDevUIProcessor#produceProviderComponent(), which produces a synthetic bean.
I'm an expert in neither Keycloak nor OTel, so I cannot really say whether it's possible to break the cycle easily...
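To make the conflict easier to see, here is a heavily simplified, purely illustrative sketch of the two build-step shapes involved (not the actual Quarkus sources; recorder parameters, configuration and the @Record(ExecutionTime.RUNTIME_INIT) annotations are omitted). One step consumes BeanContainerBuildItem and therefore can only run after the CDI container is assembled, while the other produces a SyntheticBeanBuildItem, which ArC must process before the container can be assembled.

```java
import java.util.Optional;
import java.util.logging.Handler;

import io.quarkus.arc.deployment.BeanContainerBuildItem;
import io.quarkus.arc.deployment.SyntheticBeanBuildItem;
import io.quarkus.deployment.annotations.BuildProducer;
import io.quarkus.deployment.annotations.BuildStep;
import io.quarkus.deployment.builditem.LogHandlerBuildItem;
import io.quarkus.runtime.RuntimeValue;

class CycleIllustrationProcessor {

    // Shape of LogHandlerProcessor#build: consuming BeanContainerBuildItem means this step
    // can only run once ArcProcessor#signalBeanContainerReady has produced the container.
    @BuildStep
    LogHandlerBuildItem build(BeanContainerBuildItem beanContainer) {
        // The real step asks a recorder for an OTel-backed JUL handler; stubbed out here.
        RuntimeValue<Optional<Handler>> handler = new RuntimeValue<>(Optional.empty());
        return new LogHandlerBuildItem(handler);
    }

    // Shape of KeycloakDevUIProcessor#produceProviderComponent: registering a synthetic bean
    // makes the ArC build steps (and hence the bean container) depend on this step.
    @BuildStep
    void produceProviderComponent(BuildProducer<SyntheticBeanBuildItem> syntheticBeans) {
        // syntheticBeans.produce(SyntheticBeanBuildItem.configure(...)...done());
    }
}
```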
That seems to be the case. LogHandlerProcessor must have a CDI container available to retrieve OTel:
https://github.com/quarkusio/quarkus/blob/42e6c329f61afbe05be80af95aa18a1ef2781d2e/extensions/opentelemetry/runtime/src/main/java/io/quarkus/opentelemetry/runtime/logs/OpenTelemetryLogRecorder.java#L19
And AbstractDevUIProcessor is creating a bean here: https://github.com/quarkusio/quarkus/blob/a8f21be8fdae0ff284d2410e9eb97c38a5f5c696/extensions/oidc/deployment/src/main/java/io/quarkus/oidc/deployment/devservices/AbstractDevUIProcessor.java#L73
It is very strange that something like this hasn't happened before, given the synthetic bean creation in AbstractDevUIProcessor.
Any extension requiring bean instances at runtime init might get this issue...
The important part:
to io.quarkus.opentelemetry.deployment.logging.LogHandlerProcessor#build produced class io.quarkus.deployment.builditem.LogHandlerBuildItem
to io.quarkus.deployment.logging.LoggingResourceProcessor#setupLoggingRuntimeInit produced class io.quarkus.deployment.logging.LoggingSetupBuildItem
to io.quarkus.devservices.keycloak.KeycloakDevServicesProcessor#startKeycloakContainer produced class io.quarkus.devservices.keycloak.KeycloakDevServicesConfigBuildItem
to io.quarkus.oidc.deployment.devservices.keycloak.KeycloakDevUIProcessor#produceProviderComponent produced class io.quarkus.arc.deployment.SyntheticBeanBuildItem
to io.quarkus.arc.deployment.SyntheticBeansProcessor#initRuntime produced class io.quarkus.arc.deployment.BeanRegistrationPhaseBuildItem$BeanConfiguratorBuildItem
@mkouba, @brunobat,
The reason KeycloakDevUIProcessor uses a synthetic build item is to support producing JSONRpcProviderBuildItem here, so that OidcDevJsonRpcService is able to retrieve the required properties here - there is no other way to pass the properties to it but indirectly via ArC.
I'd also like to CC @phillip-kruger.
OpenTelemetry openTelemetry = beanContainer.beanInstance(OpenTelemetry.class);
What I'm not sure about is why exactly KeycloakDevServicesProcessor#startKeycloakContainer is the step impacted by it; there are so many other build steps everywhere which produce synthetic build items...
I'm surprised this one doesn't get more attention, as it makes OTel logging mostly unusable for anyone using OIDC - which will concern quite a lot of people out there.
For me, there are two alternatives:
I'm not sure if either of these is better or even doable, but the current situation is not something we can live with.
@mkouba I think your expertise on the first point would be interesting; @phillip-kruger, your perspective on the second would be valuable.
I can't say I was able to walk through it all, but what I see from the code is that LogHandlerProcessor#build() does not have to consume the BeanContainerBuildItem, because (a) it's @Record(ExecutionTime.RUNTIME_INIT) and the CDI container is always running at this point, and (b) OpenTelemetryLogRecorder#initializeHandler() can simply replace beanContainer.beanInstance(OpenTelemetry.class) with Arc.container().instance(OpenTelemetry.class).get(). This could help to break the cycle...
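To make that concrete, here is a minimal sketch of what the recorder side could look like (the real OpenTelemetryLogRecorder#initializeHandler takes more parameters, and createOtelHandler below is a hypothetical helper standing in for the actual handler construction):

```java
import java.util.Optional;
import java.util.logging.Handler;

import io.opentelemetry.api.OpenTelemetry;
import io.quarkus.arc.Arc;
import io.quarkus.arc.InstanceHandle;
import io.quarkus.runtime.RuntimeValue;
import io.quarkus.runtime.annotations.Recorder;

@Recorder
public class OpenTelemetryLogRecorderSketch {

    public RuntimeValue<Optional<Handler>> initializeHandler() {
        // Previously: OpenTelemetry openTelemetry = beanContainer.beanInstance(OpenTelemetry.class);
        // At RUNTIME_INIT the ArC container is already running, so it can be used directly.
        InstanceHandle<OpenTelemetry> handle = Arc.container().instance(OpenTelemetry.class);
        if (!handle.isAvailable()) {
            // OpenTelemetry is itself a synthetic bean and may not be initialized yet at this
            // point (see the follow-up comments below).
            return new RuntimeValue<>(Optional.empty());
        }
        return new RuntimeValue<>(Optional.of(createOtelHandler(handle.get())));
    }

    private Handler createOtelHandler(OpenTelemetry openTelemetry) {
        // The real recorder builds the OTel JUL handler here; omitted in this sketch.
        throw new UnsupportedOperationException("illustrative only");
    }
}
```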
@mkouba and we're sure it won't pave the way for weird race conditions? That's what I feared.
I'll have a look, prepare a patch with what you're suggesting and make sure it fixes the problem, and then we can discuss how safe it is.
I created https://github.com/quarkusio/quarkus/pull/44557 with Martin's suggestion.
@mkouba and we're sure it won't pave the way for weird race conditions? That's what I feared.
I can't say because I don't know much about this domain. But it should be functionally equivalent, i.e. consuming BeanContainerBuildItem does not make any difference.
I'll have a look, prepare a patch with what you're suggesting and make sure it fixes the problem, and then we can discuss how safe it is.
👍
Judging from the CI failures, https://github.com/quarkusio/quarkus/pull/44557 will not work, because OTel is a synthetic bean and it doesn't seem to be initialised yet when ArC is used to look it up. This could work, but it would need a delegate that sets OTel once it's available... However, log lines could be lost in the meantime.
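For illustration, the delegate idea could look roughly like this (purely a sketch, not existing Quarkus code): a JUL Handler that is installed up front but only forwards records once the OTel-backed handler has been set, dropping (or potentially buffering) records until then.

```java
import java.util.logging.Handler;
import java.util.logging.LogRecord;

public class DelegatingOtelLogHandler extends Handler {

    private volatile Handler delegate; // set once the OpenTelemetry bean becomes available

    public void setDelegate(Handler delegate) {
        this.delegate = delegate;
    }

    @Override
    public void publish(LogRecord logRecord) {
        Handler d = delegate;
        if (d != null) {
            d.publish(logRecord);
        }
        // else: the record is dropped - this is the "log lines could be lost" caveat;
        // buffering and replaying records later would avoid the loss at a memory cost.
    }

    @Override
    public void flush() {
        Handler d = delegate;
        if (d != null) {
            d.flush();
        }
    }

    @Override
    public void close() {
        Handler d = delegate;
        if (d != null) {
            d.close();
        }
    }
}
```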
Yeah, I'm not sure how to solve this.
We could maybe discuss with @phillip-kruger if we could come up with a way to produce properties for the Dev UI outside of beans, but I wonder if we wouldn't get hit by this again in the future.
We actually already have a way to delay the logging until the config is properly initialized, but my guess is that it might not be easy to delay it further, until after the CDI initialization.
But TBH, I don't see a lot more options. Even if we could avoid creating beans for passing Dev UI properties, I'm pretty sure we would get hit by this issue again in other circumstances.
/cc @geoand @dmlloyd @cescoffier @maxandersen
Yeah, I'm not sure how to solve this.
We could maybe discuss with @phillip-kruger if we could come up with a way to produce properties for the Dev UI outside of beans, but I wonder if we wouldn't get hit by this again in the future.
We actually already have a way to delay the logging until the config is properly initialized, but my guess is that it might not be easy to delay it further, until after the CDI initialization.
But TBH, I don't see a lot more options. Even if we could avoid creating beans for passing Dev UI properties, I'm pretty sure we would get hit by this issue again in other circumstances.
True, and it might already be happening elsewhere, but we are not seeing it because OTel logging is off by default.
I am busy looking at this. It seems like most of the things currently being exposed in the JSON-RPC service (for Dev UI) in OIDC can be moved to a BuildTimeAction. This seems to fix the problem with Dev UI, but I still get a CIRCULAR REFERENCE error. I'll do a PR soon with the OIDC changes, but I am not sure this is the root problem. (You can also test this by just removing all Dev UI from OIDC; you should get the same CIRCULAR REFERENCE error.)
I have opened #44590 to change the way that data is made available in the OIDC Dev UI JSON-RPC service. There are various ways this could be done; I took the approach with the smallest change. The data is still being served by a JSON-RPC service, but the service no longer looks up a bean - the data is set directly on the service with a recorder. Another option would be to use BuildTimeData, which can also take a RuntimeValue, but that is a bigger change. (I did this initially, but the end result is the same.)
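As a rough illustration of that recorder-based approach (the names below are hypothetical and are not taken from #44590): instead of the JSON-RPC service resolving a synthetic bean, a recorder pushes the properties onto the service at startup.

```java
import java.util.Map;

import io.quarkus.runtime.annotations.Recorder;

// Hypothetical recorder: runs at runtime init and sets the properties straight on the
// JSON-RPC service, so no synthetic bean (and no ArC lookup) is needed.
@Recorder
public class OidcDevUiRecorderSketch {

    public void setProperties(Map<String, String> devUiProperties) {
        OidcDevJsonRpcServiceSketch.setProperties(devUiProperties);
    }
}

// Hypothetical JSON-RPC service: reads the statically-set properties instead of
// injecting a bean produced via SyntheticBeanBuildItem.
class OidcDevJsonRpcServiceSketch {

    private static volatile Map<String, String> properties;

    static void setProperties(Map<String, String> props) {
        properties = props;
    }

    public String getProperty(String name) {
        return properties == null ? null : properties.get(name);
    }
}
```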
This unfortunately does not fix the issue; it just takes Dev UI out of the picture. The error we currently get:
2024-11-20 13:00:29,903 ERROR [io.qua.dep.dev.IsolatedDevModeMain] (main) Failed to start quarkus: java.lang.RuntimeException: io.quarkus.builder.ChainBuildException: Cycle detected:
io.quarkus.arc.deployment.ArcProcessor#signalBeanContainerReady produced class io.quarkus.arc.deployment.BeanContainerBuildItem
to io.quarkus.opentelemetry.deployment.logging.LogHandlerProcessor#build produced class io.quarkus.deployment.builditem.LogHandlerBuildItem
to io.quarkus.deployment.logging.LoggingResourceProcessor#setupLoggingRuntimeInit produced class io.quarkus.deployment.logging.LoggingSetupBuildItem
to io.quarkus.devservices.keycloak.KeycloakDevServicesProcessor#startKeycloakContainer produced class io.quarkus.devservices.keycloak.KeycloakDevServicesConfigBuildItem
to io.quarkus.oidc.deployment.devservices.keycloak.KeycloakDevUIProcessor#produceProviderComponent produced class io.quarkus.arc.deployment.SyntheticBeanBuildItem
to io.quarkus.arc.deployment.SyntheticBeansProcessor#initRegular produced class io.quarkus.arc.deployment.BeanRegistrationPhaseBuildItem$BeanConfiguratorBuildItem
to io.quarkus.arc.deployment.ArcProcessor#registerSyntheticObservers produced class io.quarkus.arc.deployment.ValidationPhaseBuildItem$ValidationErrorBuildItem
to io.quarkus.arc.deployment.ArcProcessor#generateResources produced class io.quarkus.arc.deployment.ResourcesGeneratedPhaseBuildItem
to io.quarkus.arc.deployment.ArcProcessor#initializeContainer produced class io.quarkus.arc.deployment.ArcContainerBuildItem
to io.quarkus.arc.deployment.ArcProcessor#notifyBeanContainerListeners produced class io.quarkus.arc.deployment.PreBeanContainerBuildItem
to io.quarkus.arc.deployment.ArcProcessor#signalBeanContainerReady
at io.quarkus.runner.bootstrap.AugmentActionImpl.runAugment(AugmentActionImpl.java:355)
at io.quarkus.runner.bootstrap.AugmentActionImpl.createInitialRuntimeApplication(AugmentActionImpl.java:272)
at io.quarkus.runner.bootstrap.AugmentActionImpl.createInitialRuntimeApplication(AugmentActionImpl.java:62)
at io.quarkus.deployment.dev.IsolatedDevModeMain.firstStart(IsolatedDevModeMain.java:89)
at io.quarkus.deployment.dev.IsolatedDevModeMain.accept(IsolatedDevModeMain.java:428)
at io.quarkus.deployment.dev.IsolatedDevModeMain.accept(IsolatedDevModeMain.java:55)
at io.quarkus.bootstrap.app.CuratedApplication.runInCl(CuratedApplication.java:138)
at io.quarkus.bootstrap.app.CuratedApplication.runInAugmentClassLoader(CuratedApplication.java:93)
at io.quarkus.deployment.dev.DevModeMain.start(DevModeMain.java:131)
at io.quarkus.deployment.dev.DevModeMain.main(DevModeMain.java:62)
Caused by: io.quarkus.builder.ChainBuildException: Cycle detected:
io.quarkus.arc.deployment.ArcProcessor#signalBeanContainerReady produced class io.quarkus.arc.deployment.BeanContainerBuildItem
to io.quarkus.opentelemetry.deployment.logging.LogHandlerProcessor#build produced class io.quarkus.deployment.builditem.LogHandlerBuildItem
to io.quarkus.deployment.logging.LoggingResourceProcessor#setupLoggingRuntimeInit produced class io.quarkus.deployment.logging.LoggingSetupBuildItem
to io.quarkus.devservices.keycloak.KeycloakDevServicesProcessor#startKeycloakContainer produced class io.quarkus.devservices.keycloak.KeycloakDevServicesConfigBuildItem
to io.quarkus.oidc.deployment.devservices.keycloak.KeycloakDevUIProcessor#produceProviderComponent produced class io.quarkus.arc.deployment.SyntheticBeanBuildItem
to io.quarkus.arc.deployment.SyntheticBeansProcessor#initRegular produced class io.quarkus.arc.deployment.BeanRegistrationPhaseBuildItem$BeanConfiguratorBuildItem
to io.quarkus.arc.deployment.ArcProcessor#registerSyntheticObservers produced class io.quarkus.arc.deployment.ValidationPhaseBuildItem$ValidationErrorBuildItem
to io.quarkus.arc.deployment.ArcProcessor#generateResources produced class io.quarkus.arc.deployment.ResourcesGeneratedPhaseBuildItem
to io.quarkus.arc.deployment.ArcProcessor#initializeContainer produced class io.quarkus.arc.deployment.ArcContainerBuildItem
to io.quarkus.arc.deployment.ArcProcessor#notifyBeanContainerListeners produced class io.quarkus.arc.deployment.PreBeanContainerBuildItem
to io.quarkus.arc.deployment.ArcProcessor#signalBeanContainerReady
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:327)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.cycleCheckProduce(BuildChainBuilder.java:300)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:331)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:291)
at io.quarkus.builder.BuildChainBuilder.detectCycles(BuildChainBuilder.java:284)
at io.quarkus.builder.BuildChainBuilder.build(BuildChainBuilder.java:157)
at io.quarkus.deployment.QuarkusAugmentor.run(QuarkusAugmentor.java:143)
at io.quarkus.runner.bootstrap.AugmentActionImpl.runAugment(AugmentActionImpl.java:351)
... 9 more
With the changes in my PR, there is no more Dev UI in the stack trace; however, we now get this:
2024-11-20 12:45:12,256 WARNING [io.qua.ope.run.exp.otl.sen.VertxGrpcSender] (vert.x-eventloop-thread-7) Failed to export LogsRequestMarshalers. The request could not be executed. Full error message: Connection refused: localhost/127.0.0.1:4317
2024-11-20 12:45:12,258 SEVERE [io.ope.exp.int.grp.GrpcExporter] (vert.x-eventloop-thread-7) Failed to export logs. The request could not be executed. Error message: Connection refused: localhost/127.0.0.1:4317 [Error Occurred After Shutdown]: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:4317
Suppressed: java.lang.IllegalStateException: Retries exhausted: 3/3
at io.smallrye.mutiny.helpers.ExponentialBackoff$1.lambda$apply$0(ExponentialBackoff.java:46)
at io.smallrye.context.impl.wrappers.SlowContextualFunction.apply(SlowContextualFunction.java:21)
at io.smallrye.mutiny.groups.MultiOnItem.lambda$transformToUni$6(MultiOnItem.java:268)
at io.smallrye.mutiny.operators.multi.MultiConcatMapOp$MainSubscriber.onItem(MultiConcatMapOp.java:117)
at io.smallrye.mutiny.subscription.MultiSubscriber.onNext(MultiSubscriber.java:61)
at io.smallrye.mutiny.operators.multi.processors.UnicastProcessor.drainWithDownstream(UnicastProcessor.java:108)
at io.smallrye.mutiny.operators.multi.processors.UnicastProcessor.drain(UnicastProcessor.java:139)
at io.smallrye.mutiny.operators.multi.processors.UnicastProcessor.onNext(UnicastProcessor.java:205)
at io.smallrye.mutiny.operators.multi.processors.SerializedProcessor.onNext(SerializedProcessor.java:104)
at io.smallrye.mutiny.subscription.SerializedSubscriber.onItem(SerializedSubscriber.java:74)
at io.smallrye.mutiny.subscription.MultiSubscriber.onNext(MultiSubscriber.java:61)
at io.smallrye.mutiny.operators.multi.MultiRetryWhenOp$RetryWhenOperator.onFailure(MultiRetryWhenOp.java:127)
at io.smallrye.mutiny.subscription.MultiSubscriber.onError(MultiSubscriber.java:73)
at io.smallrye.mutiny.converters.uni.UniToMultiPublisher$UniToMultiSubscription.onFailure(UniToMultiPublisher.java:104)
at io.smallrye.mutiny.operators.uni.UniOnFailureFlatMap$UniOnFailureFlatMapProcessor.dispatch(UniOnFailureFlatMap.java:85)
at io.smallrye.mutiny.operators.uni.UniOnFailureFlatMap$UniOnFailureFlatMapProcessor.onFailure(UniOnFailureFlatMap.java:60)
at io.smallrye.mutiny.operators.uni.builders.UniCreateFromCompletionStage$CompletionStageUniSubscription.forwardResult(UniCreateFromCompletionStage.java:60)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
at java.base/java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:2194)
at io.vertx.core.Future.lambda$toCompletionStage$3(Future.java:604)
at io.vertx.core.impl.future.FutureImpl$4.onFailure(FutureImpl.java:188)
at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:81)
at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:278)
at io.vertx.core.impl.future.Mapping.onFailure(Mapping.java:45)
at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:81)
at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:278)
at io.vertx.core.http.impl.HttpClientImpl.lambda$doRequest$4(HttpClientImpl.java:398)
at io.vertx.core.net.impl.pool.Endpoint.lambda$getConnection$0(Endpoint.java:52)
at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:162)
at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:123)
at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:342)
at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:335)
at io.vertx.core.net.impl.pool.SimpleConnectionPool$ConnectFailed$1.run(SimpleConnectionPool.java:380)
at io.vertx.core.net.impl.pool.Task.runNextTasks(Task.java:43)
at io.vertx.core.net.impl.pool.CombinerExecutor.submit(CombinerExecutor.java:91)
at io.vertx.core.net.impl.pool.SimpleConnectionPool.execute(SimpleConnectionPool.java:244)
at io.vertx.core.net.impl.pool.SimpleConnectionPool.lambda$connect$2(SimpleConnectionPool.java:258)
at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint.lambda$connect$2(SharedClientHttpStreamEndpoint.java:104)
at io.vertx.core.impl.future.FutureImpl$4.onFailure(FutureImpl.java:188)
at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:81)
at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:278)
at io.vertx.core.impl.future.Composition$1.onFailure(Composition.java:66)
at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:81)
at io.vertx.core.impl.future.FailedFuture.addListener(FailedFuture.java:98)
at io.vertx.core.impl.future.Composition.onFailure(Composition.java:55)
at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:81)
at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:278)
at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:342)
at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:335)
at io.vertx.core.net.impl.NetClientImpl.failed(NetClientImpl.java:351)
at io.vertx.core.net.impl.NetClientImpl.lambda$connectInternal2$6(NetClientImpl.java:323)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
at io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:629)
at io.netty.util.concurrent.DefaultPromise.setFailure(DefaultPromise.java:110)
at io.vertx.core.net.impl.ChannelProvider.lambda$handleConnect$0(ChannelProvider.java:157)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:583)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:559)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
at io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:629)
at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:118)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:326)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:342)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:4317
Suppressed: [CIRCULAR REFERENCE: java.lang.IllegalStateException: Retries exhausted: 3/3]
Caused by: java.net.ConnectException: Connection refused
at java.base/sun.nio.ch.Net.pollConnect(Native Method)
at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:682)
at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:973)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:336)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:339)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:776)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: [CIRCULAR REFERENCE: java.net.ConnectException: Connection refused]
I hope this helps you get to the root issue.
(cc @brunobat @gsmet @mkouba @sberyozkin )
@phillip-kruger this is actually good news! This error:
2024-11-20 12:45:12,256 WARNING [io.qua.ope.run.exp.otl.sen.VertxGrpcSender] (vert.x-eventloop-thread-7) Failed to export LogsRequestMarshalers. The request could not be executed. Full error message: Connection refused: localhost/127.0.0.1:4317
2024-11-20 12:45:12,258 SEVERE [io.ope.exp.int.grp.GrpcExporter] (vert.x-eventloop-thread-7) Failed to export logs. The request could not be executed. Error message: Connection refused: localhost/127.0.0.1:4317 [Error Occurred After Shutdown]: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:4317
Suppressed: java.lang.IllegalStateException: Retries exhausted: 3/3
means you don't have an OTel collector on that endpoint, so the telemetry could not be exported. Your fix works!
Ok great!
I am so glad there is a potential fix here. I had this issue and was about to go insane.
Describe the bug
I have an application that uses the quarkus-oidc and quarkus-opentelemetry extensions with the quarkus.otel.logs.enabled=true property.
Start the application using ./gradlew quarkusDev.
Expected behavior
Application starts in dev mode with keycloak devservice running.
Actual behavior
Application does not start because of a ChainBuildException.
How to Reproduce?
Reproducer repository: https://github.com/BogdanAlexa1234/quarkus-repro-otel-log-keycloak
Output of uname -a or ver
Linux bogdan 5.15.0-124-generic #134-Ubuntu SMP Fri Sep 27 20:20:17 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Output of java -version
openjdk 17.0.8 2023-07-18
OpenJDK Runtime Environment Temurin-17.0.8+7 (build 17.0.8+7)
OpenJDK 64-Bit Server VM Temurin-17.0.8+7 (build 17.0.8+7, mixed mode, sharing)
Quarkus version or git rev
3.16.0
Build tool (i.e. output of mvnw --version or gradlew --version)
Gradle 8.10
Additional information
No response