quarkusio / quarkus

Quarkus: Supersonic Subatomic Java.
https://quarkus.io
Apache License 2.0

[CI] - Quarkus Super Heroes + Quarkus main #23612

Closed. quarkus-super-heroes-bot closed this 2 months ago.

quarkus-super-heroes-bot commented 2 years ago

This issue will be opened and closed depending on the state of https://github.com/quarkusio/quarkus-super-heroes building against the Quarkus main snapshot.

If you are interested in being notified about this, subscribe to this issue.

Closing #23425 in favor of this one.

ozangunalp commented 2 years ago

I may have isolated a change that can lead to this. I am working on a fix.

quarkus-super-heroes-bot commented 2 years ago

The build is still failing:

quarkus-super-heroes-bot commented 2 years ago

The build is still failing:

quarkus-super-heroes-bot commented 2 years ago

The build is still failing:

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

edeandrea commented 1 year ago

@ozangunalp any further word on this?

ozangunalp commented 1 year ago

@edeandrea here is the state of my analysis: I don't want to blindly revert the change that breaks this code, because that change itself fixes a bug in the Kafka connector. I am trying to understand how the code from SuperStats worked in the first place. Independent of the change, I don't think it was right to use the Mutiny context to store the message (and ack it before returning to the stream).
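
For readers following along, the pattern in question looks roughly like this; a minimal sketch, not the actual SuperStats code, with the channel name, context key, and payload type all hypothetical:

```java
// Sketch of the questioned pattern: stash the incoming Message in the
// Mutiny Context and ack it before the item is returned to the stream.
import io.smallrye.mutiny.Multi;
import org.eclipse.microprofile.reactive.messaging.Message;

public class ContextStashSketch {

    Multi<String> stashAndAck(Multi<Message<String>> incoming) {
        return incoming
            .attachContext() // exposes the per-subscription Mutiny Context
            .map(item -> {
                item.context().put("original-message", item.get());
                item.get().ack(); // acked before returning to the stream
                return item.get().getPayload();
            });
    }
}
```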

edeandrea commented 1 year ago

It stored the message in the context, then did some processing on that message: two different aggregations. One aggregation gets sent off to its own channel; after that, processing continues, the message is aggregated differently, and the result is returned to a different channel. Both channels are in-memory channels.

I couldn't figure out a way, without using the context, to hang onto the original message so that it could be acked after both aggregations had completed.
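
Roughly, the shape is this (a hedged sketch under assumed channel names, context key, and payload types, not the actual SuperStats code):

```java
// Two aggregations over the same incoming message, each feeding its own
// in-memory channel, with the original message stashed in the Mutiny Context
// and acked only after the second aggregation has run.
import io.smallrye.mutiny.Multi;
import jakarta.enterprise.context.ApplicationScoped;
import org.eclipse.microprofile.reactive.messaging.Incoming;
import org.eclipse.microprofile.reactive.messaging.Message;
import org.eclipse.microprofile.reactive.messaging.Outgoing;

@ApplicationScoped
public class StatsFlowSketch {

    private static final String ORIGINAL = "original-message"; // hypothetical context key

    @Incoming("fights")
    @Outgoing("team-stats") // first in-memory channel
    public Multi<Message<String>> aggregateByTeam(Multi<Message<String>> fights) {
        return fights
            .attachContext()
            .map(item -> {
                item.context().put(ORIGINAL, item.get());
                // Message.of(...) deliberately detaches the ack from the payload;
                // the original message is acked manually later.
                return Message.of("teams:" + item.get().getPayload());
            });
    }

    @Incoming("team-stats")
    @Outgoing("winner-stats") // second in-memory channel
    public Multi<Message<String>> aggregateByWinner(Multi<Message<String>> teamStats) {
        return teamStats
            .attachContext()
            .map(item -> {
                Message<String> original = item.context().get(ORIGINAL);
                original.ack(); // ack only once both aggregations have completed
                return Message.of("winners:" + item.get().getPayload());
            });
    }
}
```

If the Mutiny context stops being propagated across that in-memory hop, the second stage can no longer retrieve (or ack) the stashed message, which lines up with the breakage being discussed here.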

edeandrea commented 1 year ago

Plus all the OpenTelemetry stuff is mixed in there too (capturing the parent span off the message, creating new child spans, closing them).

I'd love to remove all that OTel complexity, but the last time I tried, the spans didn't carry over to the message properly. That was a few Quarkus versions ago, though.
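
For reference, that span handling looks roughly like this; a sketch assuming SmallRye Reactive Messaging's TracingMetadata is attached to the incoming message, with the span name and class name illustrative:

```java
// Capture the parent span context off the message's tracing metadata, open a
// child span for the processing step, and close it when done.
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.context.Context;
import io.opentelemetry.context.Scope;
import io.smallrye.reactive.messaging.TracingMetadata;
import org.eclipse.microprofile.reactive.messaging.Message;

public class SpanHandlingSketch {

    void processWithChildSpan(Tracer tracer, Message<String> message) {
        // Parent context captured off the message, falling back to the current one
        Context parent = message.getMetadata(TracingMetadata.class)
                .map(TracingMetadata::getCurrentContext)
                .orElse(Context.current());

        Span child = tracer.spanBuilder("aggregate-fight")
                .setParent(parent)
                .startSpan();
        try (Scope ignored = child.makeCurrent()) {
            // ... aggregation work happens while the child span is current ...
        } finally {
            child.end(); // close the child span once processing completes
        }
    }
}
```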

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

ozangunalp commented 1 year ago

@edeandrea #195 is a temporary fix for this issue. I am working on a fix in Reactive Messaging, which may take more time.

edeandrea commented 1 year ago

Thank you @ozangunalp, I merged it. We'll see if this turns green tomorrow!

Thank you very much for your help!

quarkus-super-heroes-bot commented 1 year ago

Build fixed:

quarkus-super-heroes-bot commented 1 year ago

Unfortunately, the build failed:

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

edeandrea commented 1 year ago

@geoand I started looking at this most recent failure, which I can reproduce on my local machine.

For some reason, when Failsafe runs the ITs, the Quarkus app gets restarted multiple times (3 times, to be exact):

This restarting is what causes the tests to fail. The ITs run as an ordered sequence of tests (i.e. one test creates data that the next test assumes is there). The app restarting kills the app's state and causes the test expectations to fail.

Any thoughts as to why this might be happening? It's only a single IT test class using @QuarkusIntegrationTest; its rough shape is sketched below.
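
A minimal sketch of that shape (only the class name comes from the log below; endpoints and payloads are illustrative):

```java
// An ordered @QuarkusIntegrationTest: later tests assume state created by
// earlier ones, so an app restart between methods breaks the sequence.
import static io.restassured.RestAssured.given;

import io.quarkus.test.junit.QuarkusIntegrationTest;
import org.junit.jupiter.api.MethodOrderer.OrderAnnotation;
import org.junit.jupiter.api.Order;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestMethodOrder;

@QuarkusIntegrationTest
@TestMethodOrder(OrderAnnotation.class)
class FightResourceIT {

    @Test
    @Order(1)
    void performFight() {
        // Creates state (a fight) that the next test expects to find
        given().contentType("application/json")
               .body("{}") // illustrative payload
               .when().post("/api/fights")
               .then().statusCode(200);
    }

    @Test
    @Order(2)
    void fightWasPersisted() {
        // Fails if the app restarted in between, wiping the state above
        given().when().get("/api/fights")
               .then().statusCode(200);
    }
}
```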

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

geoand commented 1 year ago

@edeandrea https://github.com/quarkusio/quarkus/pull/29752 fixes the restart issue. If you include that fix, then you'll see the ITs failing with:

See entire output:

```posh
[INFO] Listening for transport dt_socket at address: 5005
[INFO] Running io.quarkus.sample.superheroes.fight.rest.FightResourceIT
10:07:47 INFO [or.te.do.DockerClientProviderStrategy] (build-11) Loaded org.testcontainers.dockerclient.UnixSocketClientProviderStrategy from ~/.testcontainers.properties, will try it first
Dec 08, 2022 10:07:48 AM io.quarkus.kubernetes.client.deployment.DevServicesKubernetesProcessor startKubernetes
INFO: Not starting dev services for Kubernetes, the kubernetes client is auto-configured. Set quarkus.kubernetes-client.devservices.override-kubeconfig to true to use dev services for Kubernetes.
10:07:48 INFO [or.te.do.DockerClientProviderStrategy] (build-11) Found Docker environment with local Unix socket (unix:///var/run/docker.sock)
10:07:48 INFO [or.te.DockerClientFactory] (build-11) Docker host IP address is localhost
10:07:48 INFO [or.te.DockerClientFactory] (build-11) Connected to docker:
  Server Version: 20.10.21
  API Version: 1.41
  Operating System: Ubuntu 20.04.5 LTS
  Total Memory: 64231 MB
10:07:48 INFO [or.te.ut.ImageNameSubstitutor] (build-11) Image name substitution will be performed by: DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor')
10:07:48 INFO [or.te.DockerClientFactory] (build-11) Checking the system...
10:07:48 INFO [or.te.DockerClientFactory] (build-11) ✔︎ Docker server version should be at least 1.6.0
10:07:48 INFO [🐳.io.2.3.Final]] (build-11) Creating container for image: quay.io/apicurio/apicurio-registry-mem:2.2.3.Final
10:07:48 INFO [🐳.io.4]] (build-1) Creating container for image: docker.io/mongo:4.4
10:07:48 INFO [🐳.io.3.4]] (build-15) Creating container for image: docker.io/vectorized/redpanda:v22.3.4
10:07:48 INFO [🐳.3.4]] (build-11) Creating container for image: testcontainers/ryuk:0.3.4
10:07:48 INFO [or.te.ut.RegistryAuthLocator] (build-11) Failure when attempting to lookup auth config. Please ignore if you don't have images in an authenticated registry. Details: (dockerImageName: quay.io/apicurio/apicurio-registry-mem:2.2.3.Final, configFile: /home/gandrian/.docker/config.json. Falling back to docker-java default behaviour. Exception message: Could not execute [docker-credential-pass, get]. Error=2, No such file or directory
[... the same RegistryAuthLocator warning repeats for docker.io/mongo:4.4 and testcontainers/ryuk:0.3.4 ...]
10:07:48 INFO [🐳.3.4]] (build-11) Container testcontainers/ryuk:0.3.4 is starting: 90b92222555b6bd8abc3f220a0bb7cc0f3c7a4107f6375e527bee11e9c4c596c
10:07:48 INFO [🐳.3.4]] (build-11) Container testcontainers/ryuk:0.3.4 started in PT0.392643178S
10:07:48 INFO [🐳.io.2.3.Final]] (build-11) Container quay.io/apicurio/apicurio-registry-mem:2.2.3.Final is starting: c282b53ab89f24e7b792be7893389460cf80b728f89070148202145433a53825
10:07:48 INFO [🐳.io.4]] (build-1) Container docker.io/mongo:4.4 is starting: 6ae32f30563b51a0a86ede3b9d3ef266ee0c97b8b0a2c44c340e985a37d16007
10:07:48 INFO [🐳.io.3.4]] (build-15) Container docker.io/vectorized/redpanda:v22.3.4 is starting: 291af0503ec4e95f14601e6933860cbebca494c77eaa1efadf7ea213c9f2e710
10:07:49 INFO [🐳.io.4]] (build-1) Container docker.io/mongo:4.4 started in PT1.563664804S
10:07:49 INFO [🐳.io.3.4]] (build-15) Container docker.io/vectorized/redpanda:v22.3.4 started in PT1.647121349S
Dec 08, 2022 10:07:49 AM io.quarkus.kafka.client.deployment.DevServicesKafkaProcessor startKafkaDevService
INFO: Dev Services for Kafka started. Other Quarkus applications in dev mode will find the broker automatically. For Quarkus applications in production mode, you can connect to this by starting your application with -Dkafka.bootstrap.servers=OUTSIDE://localhost:49238
10:07:50 INFO [🐳.io.2.3.Final]] (build-11) Container quay.io/apicurio/apicurio-registry-mem:2.2.3.Final started in PT2.297801332S
Dec 08, 2022 10:07:50 AM io.quarkus.apicurio.registry.devservice.DevServicesApicurioRegistryProcessor startApicurioRegistryDevService
INFO: Dev Services for Apicurio Registry started. The registry is available at http://localhost:49236/apis/registry/v2
[... WireMock's embedded Jetty (jetty-9.4.49.v20220914) starts, with a NetworkTrafficServerConnector listening on 0.0.0.0:34047 ...]
Executing "/home/gandrian/.sdkman/candidates/java/17.0.4.1-tem/bin/java -Dquarkus.http.port=0 -Dquarkus.http.ssl-port=8444 -Dtest.url=http://localhost:0 -Dquarkus.log.file.path=/home/gandrian/projects/redhat/quarkus-super-heroes/rest-fights/target/quarkus.log -Dquarkus.log.file.enable=true -Dquarkus.stork.villain-service.service-discovery.address-list=localhost:34047 -Dmp.messaging.connector.smallrye-kafka.apicurio.registry.url=http://localhost:49236/apis/registry/v2 -Dkafka.bootstrap.servers=OUTSIDE://localhost:49238 -Dquarkus.mongodb.connection-string=mongodb://localhost:49237/fights -Dquarkus.stork.hero-service.service-discovery.address-list=localhost:34047 -Dmp.messaging.connector.smallrye-kafka.schema.registry.url=http://localhost:49236/apis/ccompat/v6 -jar /home/gandrian/projects/redhat/quarkus-super-heroes/rest-fights/target/quarkus-app/quarkus-run.jar"
__  ____  __  _____   ___  __ ____  ______
 --/ __ \/ / / / _ | / _ \/ //_/ / / / __/
 -/ /_/ / /_/ / __ |/ , _/ ,< / /_/ /\ \
--\___\_\____/_/ |_/_/|_/_/|_|\____/___/
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [or.mo.dr.client] (main) MongoClient with metadata {"driver": {"name": "mongo-java-driver|sync", "version": "4.8.0"}, "os": {"type": "Linux", "name": "Linux", "architecture": "amd64", "version": "5.15.0-56-generic"}, "platform": "Java/Eclipse Adoptium/17.0.4.1+1"} created with settings MongoClientSettings{... full settings dump elided ...}
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [or.mo.dr.cluster] (cluster-ClusterId{value='63919b578376db5c3061f872', description='null'}-localhost:49237) Monitor thread successfully connected to server with description ServerDescription{address=localhost:49237, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, setName='docker-rs', primary='6ae32f30563b:27017', ... full description elided ...}
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.parser] (main) INFO: An older version of the XSD is specified in one or more changelog's header. This can lead to unexpected outcomes. If a specific XSD is not required, please replace all XSD version references with "-latest". Learn more at https://docs.liquibase.com
[... a second, essentially identical MongoClient (sync driver) is created and connects ...]
[... Liquibase creates and adjusts the fights.DATABASECHANGELOGLOCK and fights.DATABASECHANGELOG collections ...]
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.ext] (main) Lock Database
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.ext] (main) Successfully Acquired Change Log Lock
Running Changeset: db/changeLog.xml::1::edeandrea
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.changelog] (main) Collection Fights created
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.changelog] (main) Documents inserted into collection Fights
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.changelog] (main) ChangeSet db/changeLog.xml::1::edeandrea ran successfully in 9ms
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.ext] (main) Release Database Lock
10:07:51 INFO traceId=, parentId=, spanId=, sampled= [li.ext] (main) Successfully released change log lock
10:07:52 WARN traceId=, parentId=, spanId=, sampled= [or.ap.ka.cl.pr.ProducerConfig] (smallrye-kafka-producer-thread-0) These configurations '[apicurio.registry.url, schema.registry.url, apicurio.registry.auto-register]' were supplied but are not used yet.
10:07:52 INFO traceId=, parentId=, spanId=, sampled= [io.sm.re.me.kafka] (smallrye-kafka-producer-thread-0) SRMSG18258: Kafka producer kafka-producer-fights, connected to Kafka brokers 'OUTSIDE://localhost:49238', is configured to write records to 'fights'
10:07:52 INFO traceId=, parentId=, spanId=, sampled= [io.quarkus] (main) rest-fights 1.0 on JVM (powered by Quarkus 999-SNAPSHOT) started in 1.425s. Listening on: http://0.0.0.0:45821
10:07:52 INFO traceId=, parentId=, spanId=, sampled= [io.quarkus] (main) Profile prod activated.
10:07:52 INFO traceId=, parentId=, spanId=, sampled= [io.quarkus] (main) Installed features: [apicurio-registry-avro, cdi, hibernate-validator, kafka-client, kubernetes, kubernetes-client, liquibase-mongodb, micrometer, mongodb-client, mongodb-panache, narayana-jta, opentelemetry, opentelemetry-otlp-exporter, rest-client-reactive, rest-client-reactive-jackson, resteasy-reactive, resteasy-reactive-jackson, smallrye-context-propagation, smallrye-fault-tolerance, smallrye-health, smallrye-openapi, smallrye-reactive-messaging, smallrye-reactive-messaging-kafka, swagger-ui, vertx]
[... a third MongoClient (reactive-streams driver) is created and connects ...]
10:08:01 SEVERE traceId=, parentId=, spanId=, sampled= [io.op.ex.in.gr.OkHttpGrpcExporter] (OkHttp http://localhost:4317/...) Failed to export spans. The request could not be executed. Full error message: Failed to connect to localhost/127.0.0.1:4317
[... the same "Failed to export spans" error repeats every few seconds throughout the run ...]
10:08:01 WARN traceId=b1fe521cb56cd2eeb203cee02932ba2d, parentId=, spanId=30a0e813c274857c, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-19) Falling back on Hero
10:08:02 WARN traceId=b1fe521cb56cd2eeb203cee02932ba2d, parentId=, spanId=30a0e813c274857c, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-19) Falling back on Villain
10:08:05 WARN traceId=7f18e04a33d4e640092c11b777cd32f6, parentId=, spanId=cb444fa0d7f4af84, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-25) Falling back on Villain
10:08:15 WARN traceId=, parentId=, spanId=, sampled= [io.qu.sa.su.fi.se.FightService] (executor-thread-0) Could not invoke the Heroes microservice
10:08:15 WARN traceId=b5d550e174c3bb15508ebe7066ffb45b, parentId=, spanId=5e80c9f63a1ac0ca, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-30) Could not invoke the Heroes microservice
10:08:20 WARN traceId=9c7366d94f423a7f841ce67b34eb17cd, parentId=, spanId=eb2f0717d1bc4994, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-17) Falling back on Hero
10:08:20 WARN traceId=92dd7311e86ba5a32fea3ec9cbc45c90, parentId=, spanId=1816c97f34163a89, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-18) Could not invoke the Villains microservice
10:08:25 WARN traceId=, parentId=, spanId=, sampled= [io.qu.sa.su.fi.se.FightService] (executor-thread-0) Could not invoke the Villains microservice
10:08:25 INFO [or.ap.ka.cl.co.ConsumerConfig] (main) ConsumerConfig values: bootstrap.servers = [OUTSIDE://localhost:49238], client.id = companion-ddc5a533-3128-43a6-80a7-fa973e788484, group.id = fights, value.deserializer = class io.apicurio.registry.serde.avro.AvroKafkaDeserializer [... full config dump elided ...]
10:08:25 WARN [or.ap.ka.cl.co.ConsumerConfig] (main) These configurations '[apicurio.registry.url, schema.registry.url, apicurio.registry.auto-register, apicurio.registry.avro-datum-provider]' were supplied but are not used yet.
10:08:25 INFO [or.ap.ka.co.ut.AppInfoParser] (main) Kafka version: 3.3.1
10:08:25 INFO [or.ap.ka.co.ut.AppInfoParser] (main) Kafka commitId: e23c59d00e687ff5
10:08:25 INFO [or.ap.ka.co.ut.AppInfoParser] (main) Kafka startTimeMs: 1670486905418
10:08:25 INFO [or.ap.ka.cl.co.KafkaConsumer] (main) [Consumer clientId=companion-ddc5a533-3128-43a6-80a7-fa973e788484, groupId=fights] Subscribed to topic(s): fights
10:08:25 INFO traceId=a44ddf4d23183814945af785a94fe872, parentId=, spanId=ef41d42f19d044a4, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-16) Yes, Hero Super Baguette won over Super Chocolatine :o)
10:08:25 INFO [or.ap.ka.cl.Metadata] (...) [Consumer clientId=companion-ddc5a533-3128-43a6-80a7-fa973e788484, groupId=fights] Cluster ID: redpanda.3f63e2ac-073e-427f-9479-703aa755da56
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Discovered group coordinator localhost:49238 (id: 2147483647 rack: null)
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) (Re-)joining group
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Group coordinator localhost:49238 (id: 2147483647 rack: null) is unavailable or invalid due to cause: error response NOT_COORDINATOR. isDisconnected: false. Rediscovery will be attempted.
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Requesting disconnect from last known coordinator localhost:49238 (id: 2147483647 rack: null)
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) JoinGroup failed: This is not the correct coordinator. Marking coordinator unknown. Sent generation was Generation{generationId=-1, memberId='', protocol='null'}
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Request joining group due to: rebalance failed due to 'This is not the correct coordinator.' (NotCoordinatorException)
[... this coordinator discovery / "unavailable or invalid" / disconnect cycle repeats many more times ...]
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Request joining group due to: need to re-join with the given member-id: companion-ddc5a533-3128-43a6-80a7-fa973e788484-0f3f53c3-5c86-4085-8c63-7424b177fd2a
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' (MemberIdRequiredException)
10:08:25 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) (Re-)joining group
10:08:28 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Successfully joined group with generation Generation{generationId=1, memberId='companion-ddc5a533-3128-43a6-80a7-fa973e788484-0f3f53c3-5c86-4085-8c63-7424b177fd2a', protocol='range'}
10:08:28 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Finished assignment for group at generation 1: {companion-ddc5a533-3128-43a6-80a7-fa973e788484-0f3f53c3-5c86-4085-8c63-7424b177fd2a=Assignment(partitions=[fights-0])}
10:08:28 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Adding newly assigned partitions: fights-0
Dec 08, 2022 10:08:28 AM io.smallrye.reactive.messaging.kafka.companion.ConsumerBuilder onPartitionsAssigned
INFO: companion-ddc5a533-3128-43a6-80a7-fa973e788484 assigned partitions [fights-0]
10:08:28 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Found no committed offset for partition fights-0
10:08:28 INFO [or.ap.ka.cl.co.in.SubscriptionState] (...) Resetting offset for partition fights-0 to position FetchPosition{offset=1, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:49238 (id: 0 rack: null)], epoch=absent}}.
10:08:35 INFO [or.ap.ka.cl.co.ConsumerConfig] (main) ConsumerConfig values: client.id = companion-7ec16a33-2505-4e25-8757-48e491ac520e, group.id = fights [... otherwise identical config dump elided ...]
10:08:35 INFO [or.ap.ka.cl.co.KafkaConsumer] (main) [Consumer clientId=companion-7ec16a33-2505-4e25-8757-48e491ac520e, groupId=fights] Subscribed to topic(s): fights
10:08:35 INFO traceId=4ee315b7dbfcc19a4c383d44c8c565b3, parentId=, spanId=78d70d016ab9a993, sampled=true [io.qu.sa.su.fi.se.FightService] (vert.x-eventloop-thread-22) Gee, Villain Super Chocolatine won over Super Baguette :o(
Dec 08, 2022 10:08:35 AM io.smallrye.reactive.messaging.kafka.companion.ConsumerBuilder lambda$close$2
INFO: Closing consumer companion-ddc5a533-3128-43a6-80a7-fa973e788484
10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Revoke previously assigned partitions fights-0
Dec 08, 2022 10:08:35 AM io.smallrye.reactive.messaging.kafka.companion.ConsumerBuilder onPartitionsRevoked
INFO: companion-ddc5a533-3128-43a6-80a7-fa973e788484 revoked partitions [fights-0]
10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Member companion-ddc5a533-3128-43a6-80a7-fa973e788484-0f3f53c3-5c86-4085-8c63-7424b177fd2a sending LeaveGroup request to coordinator localhost:49238 (id: 2147483647 rack: null) due to the consumer is being closed
10:08:35 INFO [or.ap.ka.co.me.Metrics] (...) Metrics scheduler closed
10:08:35 INFO [or.ap.ka.co.me.Metrics] (...) Metrics reporters closed
10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Successfully joined group with generation Generation{generationId=2, memberId='companion-7ec16a33-2505-4e25-8757-48e491ac520e-f94a7c84-e386-4242-a8d8-05684b01f9bc', protocol='range'}
10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (...) Successfully synced group in generation Generation{generationId=2, memberId='companion-7ec16a33-2505-4e25-8757-48e491ac520e-f94a7c84-e386-4242-a8d8-05684b01f9bc', protocol='range'}
10:08:35
```
INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (consumer-companion-7ec16a33-2505-4e25-8757-48e491ac520e) [Consumer clientId=companion-7ec16a33-2505-4e25-8757-48e491ac520e, groupId=fights] Notifying assignor about the new Assignment(partitions=[fights-0]) 10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (consumer-companion-7ec16a33-2505-4e25-8757-48e491ac520e) [Consumer clientId=companion-7ec16a33-2505-4e25-8757-48e491ac520e, groupId=fights] Adding newly assigned partitions: fights-0 Dec 08, 2022 10:08:35 AM io.smallrye.reactive.messaging.kafka.companion.ConsumerBuilder onPartitionsAssigned INFO: companion-7ec16a33-2505-4e25-8757-48e491ac520e assigned partitions [fights-0] 10:08:35 INFO [or.ap.ka.cl.co.in.ConsumerCoordinator] (consumer-companion-7ec16a33-2505-4e25-8757-48e491ac520e) [Consumer clientId=companion-7ec16a33-2505-4e25-8757-48e491ac520e, groupId=fights] Setting offset for partition fights-0 to the committed offset FetchPosition{offset=2, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:49238 (id: 0 rack: null)], epoch=absent}} 10:08:35 INFO [or.ap.ka.co.ut.AppInfoParser] (consumer-companion-ddc5a533-3128-43a6-80a7-fa973e788484) App info kafka.consumer for companion-ddc5a533-3128-43a6-80a7-fa973e788484 unregistered 10:08:36 SEVERE traceId=, parentId=, spanId=, sampled= [io.op.ex.in.gr.OkHttpGrpcExporter] (OkHttp http://localhost:4317/...) Failed to export spans. The request could not be executed. Full error message: Failed to connect to localhost/127.0.0.1:4317 [ERROR] Tests run: 30, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 64.554 s <<< FAILURE! - in io.quarkus.sample.superheroes.fight.rest.FightResourceIT [ERROR] io.quarkus.sample.superheroes.fight.rest.FightResourceIT.performFightHeroWins Time elapsed: 10.22 s <<< FAILURE! 
java.lang.AssertionError: No completion (or failure) event received in the last 10000 ms at io.smallrye.mutiny.helpers.test.AssertSubscriber.awaitCompletion(AssertSubscriber.java:486) at io.smallrye.mutiny.helpers.test.AssertSubscriber.awaitCompletion(AssertSubscriber.java:470) at io.smallrye.reactive.messaging.kafka.companion.KafkaTask.awaitCompletion(KafkaTask.java:197) at io.quarkus.sample.superheroes.fight.rest.FightResourceIT.performFightHeroWins(FightResourceIT.java:718) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:727) at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131) at io.quarkus.test.junit.QuarkusTestExtension.interceptTestMethod(QuarkusTestExtension.java:805) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:156) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:147) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:86) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:92) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:86) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:217) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:213) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:138) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:68) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:147) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:127) at 
org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:90) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:55) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:102) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:54) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86) at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86) at org.apache.maven.surefire.junitplatform.LazyLauncher.execute(LazyLauncher.java:55) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:223) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:175) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:139) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:456) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:169) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:595) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:581) [ERROR] io.quarkus.sample.superheroes.fight.rest.FightResourceIT.performFightVillainWins Time elapsed: 10.054 s <<< FAILURE! java.lang.AssertionError: No completion (or failure) event received in the last 10000 ms at io.smallrye.mutiny.helpers.test.AssertSubscriber.awaitCompletion(AssertSubscriber.java:486) at io.smallrye.mutiny.helpers.test.AssertSubscriber.awaitCompletion(AssertSubscriber.java:470) at io.smallrye.reactive.messaging.kafka.companion.KafkaTask.awaitCompletion(KafkaTask.java:197) at io.quarkus.sample.superheroes.fight.rest.FightResourceIT.performFightVillainWins(FightResourceIT.java:794) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:727) at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131) at io.quarkus.test.junit.QuarkusTestExtension.interceptTestMethod(QuarkusTestExtension.java:805) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:156) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:147) at 
org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:86) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:92) at org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.invoke(InterceptingExecutableInvoker.java:86) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:217) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:213) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:138) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:68) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at 
java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:147) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:127) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:90) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:55) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:102) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:54) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86) at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86) at org.apache.maven.surefire.junitplatform.LazyLauncher.execute(LazyLauncher.java:55) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:223) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:175) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:139) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:456) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:169) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:595) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:581) 10:08:46 SEVERE traceId=, parentId=, spanId=, sampled= [io.op.ex.in.gr.OkHttpGrpcExporter] (OkHttp http://localhost:4317/...) Failed to export spans. The request could not be executed. 
Full error message: Failed to connect to localhost/127.0.0.1:4317 10:09:00 WARN [io.ve.co.im.BlockedThreadChecker] (vertx-blocked-thread-checker) Thread Thread[vert.x-eventloop-thread-61,5,main] has been blocked for 14456 ms, time limit is 2000 ms: io.vertx.core.VertxException: Thread blocked at java.base@17.0.4.1/jdk.internal.org.objectweb.asm.SymbolTable.hash(SymbolTable.java:1290) at java.base@17.0.4.1/jdk.internal.org.objectweb.asm.SymbolTable.addConstantMemberReference(SymbolTable.java:590) at java.base@17.0.4.1/jdk.internal.org.objectweb.asm.SymbolTable.addConstantMethodref(SymbolTable.java:573) at java.base@17.0.4.1/jdk.internal.org.objectweb.asm.MethodWriter.visitMethodInsn(MethodWriter.java:1066) at java.base@17.0.4.1/java.lang.invoke.InnerClassLambdaMetafactory.generateConstructor(InnerClassLambdaMetafactory.java:452) at java.base@17.0.4.1/java.lang.invoke.InnerClassLambdaMetafactory.generateInnerClass(InnerClassLambdaMetafactory.java:357) at java.base@17.0.4.1/java.lang.invoke.InnerClassLambdaMetafactory.spinInnerClass(InnerClassLambdaMetafactory.java:315) at java.base@17.0.4.1/java.lang.invoke.InnerClassLambdaMetafactory.buildCallSite(InnerClassLambdaMetafactory.java:228) at java.base@17.0.4.1/java.lang.invoke.LambdaMetafactory.metafactory(LambdaMetafactory.java:341) at java.base@17.0.4.1/java.lang.invoke.LambdaForm$DMH/0x0000000800c34c00.invokeStatic(LambdaForm$DMH) at java.base@17.0.4.1/java.lang.invoke.Invokers$Holder.invokeExact_MT(Invokers$Holder) at java.base@17.0.4.1/java.lang.invoke.BootstrapMethodInvoker.invoke(BootstrapMethodInvoker.java:134) at java.base@17.0.4.1/java.lang.invoke.CallSite.makeSite(CallSite.java:315) at java.base@17.0.4.1/java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:281) at java.base@17.0.4.1/java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:271) at app//io.vertx.core.impl.future.CompositeFutureImpl.(CompositeFutureImpl.java:85) at app//io.vertx.core.CompositeFuture.join(CompositeFuture.java:176) at app//io.vertx.core.eventbus.impl.EventBusImpl.unregisterAll(EventBusImpl.java:439) at app//io.vertx.core.eventbus.impl.EventBusImpl.close(EventBusImpl.java:234) at app//io.vertx.core.impl.VertxImpl.lambda$null$12(VertxImpl.java:627) at app//io.vertx.core.impl.VertxImpl$$Lambda$1793/0x00000008018b9808.handle(Unknown Source) at app//io.vertx.core.impl.ContextInternal.dispatch(ContextInternal.java:264) at app//io.vertx.core.impl.ContextInternal.dispatch(ContextInternal.java:246) at app//io.vertx.core.impl.EventLoopContext.lambda$runOnContext$0(EventLoopContext.java:43) at app//io.vertx.core.impl.EventLoopContext$$Lambda$1641/0x00000008018731e0.run(Unknown Source) at app//io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174) at app//io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167) at app//io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569) at app//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) at app//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) at java.base@17.0.4.1/java.lang.Thread.run(Thread.java:833) ```
edeandrea commented 1 year ago

I'll pull this in locally and re-test to try to narrow down what the issue could be. Something in Quarkus changed; I just have to figure out what.

quarkus-super-heroes-bot commented 1 year ago

The build is still failing:

quarkus-super-heroes-bot commented 1 year ago

Build fixed:

quarkus-super-heroes-bot commented 1 year ago

Unfortunately, the build failed:

edeandrea commented 1 year ago

@geoand / @gsmet this is due to the switch on Quarkus main to the Jakarta namespace. It also means this job will continue to fail every day until Quarkus 3 is released and the superheroes repo has been upgraded to it.

We have a couple of options:

  1. Is there a different branch in the Quarkus repo that is like "main" but only for Quarkus 2.x? If so, we can switch this job to build against that branch.
  2. Disable this job until Quarkus 3 is out.

Personally I'd prefer option 1.

Are there any options I didn't think of?

geoand commented 1 year ago

See an updated setup-and-test script: https://github.com/quarkusio/quarkus-ecosystem-ci/blob/f4d102f6dafdd4f2b72c8ef4b6b3ce59067a4862/setup-and-test#L11

edeandrea commented 1 year ago

How come it's not getting used then? It's pulling from quarkus-ecosystem-ci....

https://github.com/quarkusio/quarkus-super-heroes/blob/main/.github/workflows/quarkus-snapshot.yaml

geoand commented 1 year ago

I'll let you answer that one by looking at the .github directory of your repo and/or the CI run logs 😉

edeandrea commented 1 year ago

It's pulling from 'main' on quarkus-ecosystem-ci....

That contains the script which checks out the Quarkus repo. What am I missing?

edeandrea commented 1 year ago

https://github.com/quarkusio/quarkus-super-heroes/blob/main/.github/workflows/quarkus-snapshot.yaml#L37_L47

edeandrea commented 1 year ago

@geoand I think I know what the problem is

https://github.com/quarkusio/quarkus-ecosystem-ci/blob/main/setup-and-test#L12

 if [[ "$(mvn help:evaluate -Dexpression=quarkus.version -q -DforceStdout -f ../../current-repo/pom.xml| cut -d. -f1)" = "2" ]]; then

This assumes the project has a pom.xml at its root that declares the Quarkus version. In the case of the superheroes, that isn't true: the superheroes repo contains child projects but is not itself a multi-module Maven project.

Could we tweak the setup-and-test script a little bit to check whether there is a get-quarkus-version script in the project's repo, kind of like you already do for the test script:

```bash
# check the test script
if [ -f .github/quarkus-ecosystem-test ]; then
    echo "Test script found"
else
    echo "Test script not found - using default from quarkus-ecosystem-ci"
    cp ../ecosystem-ci/quarkus-ecosystem-test .github/quarkus-ecosystem-test
fi
```

and let the script return the version to use?
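
Concretely, the version-detection part could look something like this (a rough sketch, not a tested patch; `get-quarkus-version` is a hypothetical script name following the same convention as `quarkus-ecosystem-test`):

```bash
# Sketch only: prefer a project-provided script for reporting the Quarkus
# version, and fall back to the existing root pom.xml evaluation otherwise.
if [ -f .github/get-quarkus-version ]; then
    echo "get-quarkus-version script found"
    QUARKUS_VERSION="$(.github/get-quarkus-version)"
else
    echo "get-quarkus-version script not found - evaluating root pom.xml"
    QUARKUS_VERSION="$(mvn help:evaluate -Dexpression=quarkus.version -q -DforceStdout -f ../../current-repo/pom.xml)"
fi

# The existing major-version check then works off the computed value.
if [[ "$(echo "${QUARKUS_VERSION}" | cut -d. -f1)" = "2" ]]; then
    echo "Building against Quarkus 2.x"
fi
```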

I could do a PR for that if you're ok with the approach.

geoand commented 1 year ago

I wonder if it might be easier to go down a level and try to find a pom
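
"Going down a level" might look roughly like this (a sketch only; note it has to guess which child pom is authoritative):

```bash
# Sketch only: if the repo root has no pom.xml, probe one directory level
# down and use the first pom.xml found. The guess about which child
# project is authoritative is the weakness of this approach.
POM_FILE="../../current-repo/pom.xml"
if [ ! -f "${POM_FILE}" ]; then
    POM_FILE="$(find ../../current-repo -mindepth 2 -maxdepth 2 -name pom.xml | head -n 1)"
fi
```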

edeandrea commented 1 year ago

Thats what I"m thinking too, but the setup-and-test script would have to be injected with a path to follow - maybe via env var?

edeandrea commented 1 year ago

I don't think a generic setup-and-test script should try to figure out which pom inside the project to use. It would be better to let the project inject that info into the script.

geoand commented 1 year ago

Yeah, that's probably the right thing to do

edeandrea commented 1 year ago

So for example, in quarkusio/quarkus-super-heroes/.github/workflows/quarkus-snapshot.yaml:

```yaml
- name: Setup and Run Tests
  run: ./ecosystem-ci/setup-and-test
  env:
    ECOSYSTEM_CI_TOKEN: ${{ secrets.ECOSYSTEM_CI_TOKEN }}
    JAVA_17_PATH: ${{ env.JAVA_17_PATH }}
    QUARKUS_VERSION_PATH: rest-fights
```

Then, at line 12 of quarkusio/quarkus-ecosystem-ci/setup-and-test, we would append QUARKUS_VERSION_PATH, if it is set, to the path used to find the pom.xml.
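
Something along these lines, perhaps (just a sketch, reusing the QUARKUS_VERSION_PATH variable proposed above):

```bash
# Sketch only: if QUARKUS_VERSION_PATH is set, resolve the pom.xml inside
# that subdirectory of the checked-out project; otherwise keep the old path.
POM_DIR="../../current-repo${QUARKUS_VERSION_PATH:+/${QUARKUS_VERSION_PATH}}"

if [[ "$(mvn help:evaluate -Dexpression=quarkus.version -q -DforceStdout -f "${POM_DIR}/pom.xml" | cut -d. -f1)" = "2" ]]; then
    echo "Building against Quarkus 2.x"
fi
```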

If you're ok with this approach I'll go ahead and do a PR.

edeandrea commented 1 year ago

Also, the superheroes uses quarkus.platform.version, not quarkus.version, in its pom.xml, so we'd need to override/inject that property name into the script as well.
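
That property name could be injected the same way, e.g. (hypothetical QUARKUS_VERSION_EXPRESSION variable, reusing POM_DIR from the previous sketch and defaulting to the current behavior):

```bash
# Sketch only: let the project override which Maven property holds the
# Quarkus version; default to quarkus.version to preserve current behavior.
QUARKUS_VERSION_EXPRESSION="${QUARKUS_VERSION_EXPRESSION:-quarkus.version}"
mvn help:evaluate -Dexpression="${QUARKUS_VERSION_EXPRESSION}" -q -DforceStdout -f "${POM_DIR}/pom.xml"
```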

geoand commented 1 year ago

👍🏼

edeandrea commented 1 year ago

@geoand see quarkusio/quarkus-ecosystem-ci#119

edeandrea commented 1 year ago

This should resolve itself tomorrow.

quarkus-super-heroes-bot commented 1 year ago

Build fixed:

quarkus-super-heroes-bot commented 1 year ago

Unfortunately, the build failed:

edeandrea commented 1 year ago

This failure is my fault. It is a side effect from quarkusio/quarkus-super-heroes#283. I will fix it today.

quarkus-super-heroes-bot commented 1 year ago

Build fixed:

quarkus-super-heroes-bot commented 1 year ago

Unfortunately, the build failed:

edeandrea commented 1 year ago

I feel like this is just some weirdness on the CI machine. Kafka Dev Services was never able to start a Kafka container. I'll let it go until tomorrow and see if it resolves itself.

quarkus-super-heroes-bot commented 1 year ago

Build fixed:

quarkus-super-heroes-bot commented 10 months ago

Unfortunately, the build failed:

edeandrea commented 10 months ago

Looks like the build of Quarkus main itself failed. We'll see if this fixes itself tomorrow.

quarkus-super-heroes-bot commented 10 months ago

Build fixed:

quarkus-super-heroes-bot commented 9 months ago

Unfortunately, the build failed: