microservices-patterns / ftgo-application

Example code for the book Microservice patterns

Trying to run this application on Windows: none of the Swagger UIs are accessible ("This site cannot be reached") #85

Closed: prakashid2 closed this issue 4 years ago

prakashid2 commented 4 years ago

Hi.

Today I correctly followed the instructions given in README.adoc, step by step. I'm trying to run the application on my local laptop, i.e. on Windows. Everything worked fine except for some deprecation warnings. The last command I entered is shown below; it gave the following output:

```
ftgo-application-master> docker-compose up -d
. .
Creating ftgo-application-master_ftgo-api-gateway_1           ... done
Creating ftgo-application-master_mysql_1                      ... done
Creating ftgo-application-master_zookeeper_1                  ... done
Creating ftgo-application-master_dynamodblocal_1              ... done
Creating ftgo-application-master_zipkin_1                     ... done
Creating ftgo-application-master_dynamodblocal-init_1         ... done
Creating ftgo-application-master_kafka_1                      ... done
Creating ftgo-application-master_cdc-service_1                ... done
Creating ftgo-application-master_ftgo-accounting-service_1    ... done
Creating ftgo-application-master_ftgo-order-service_1         ... done
Creating ftgo-application-master_ftgo-restaurant-service_1    ... done
Creating ftgo-application-master_ftgo-consumer-service_1      ... done
Creating ftgo-application-master_ftgo-kitchen-service_1       ... done
Creating ftgo-application-master_ftgo-order-history-service_1 ... done

ftgo-application-master>
```

Now, as per the README document, I tried accessing the Swagger UIs, but I found that none of them are accessible. The browser simply says "This site can’t be reached. 192.168.99.100 refused to connect." (In my case DOCKER_HOST_IP is 192.168.99.100.) I don't know why this is not working.
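In case it helps to narrow "refused to connect" down: a small helper to check whether a published port on the Docker Toolbox VM even accepts TCP connections, before suspecting the individual services. This is a hedged sketch; the IP and ports are taken from this thread, everything else is generic:

```shell
# Quick connectivity check for a port published on the Docker host.
# A "Connection refused" here means nothing is listening on that port,
# i.e. the container behind it is most likely not running.
check_port() {
  host="$1"; port="$2"
  # -v shows whether the TCP connect itself succeeds, before any HTTP error
  curl -sv --max-time 5 "http://$host:$port/" -o /dev/null
}

# Usage (values from this thread):
# check_port 192.168.99.100 8087
```

A connect that succeeds but returns an HTTP error (like the Whitelabel page) means the container is up and the problem is the URL path, not networking.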

There is one more URL given in README.adoc, which is for accessing the services via the API gateway: http://${DOCKER_HOST_IP?}:8087

In my case that is http://192.168.99.100:8087/, which gives a Whitelabel Error Page, so the API gateway has at least started. Can you let me know how to access a running service via the API gateway? I need the full URL. Also, please give a sample JSON body to invoke one service, say 'Create Consumer'.

Since the Swagger UIs are not accessible I'm a bit handicapped, and I have to rely on the API gateway URL for accessing the individual services, along with the JSON bodies.

Thanks Prakash S. Mumbai, India
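For reference, a sketch of what a 'Create Consumer' request might look like. The JSON shape follows the book's CreateConsumerRequest (a nested name object with firstName/lastName), and the service port is an assumption; verify both against docker-compose.yml and the service's swagger-ui rather than treating this as authoritative:

```shell
# Hypothetical 'Create Consumer' payload -- the shape is an assumption based on
# the book's CreateConsumerRequest, not confirmed by this thread.
cat > create-consumer.json <<'EOF'
{
  "name": {
    "firstName": "John",
    "lastName": "Doe"
  }
}
EOF

# Then POST it to the consumer service. Port 8081 is an assumption -- check the
# published ports in docker-compose.yml (the gateway on 8087 mainly proxies orders):
# curl -X POST -H "Content-Type: application/json" \
#      -d @create-consumer.json http://192.168.99.100:8081/consumers
```

If the request is accepted, the service should respond with the new consumer's id; the exact response shape again depends on the service's Swagger definition.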

cer commented 4 years ago

What is the output of docker ps -a?

prakashid2 commented 4 years ago

> What is the output of docker ps -a?

```
D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker ps -a
CONTAINER ID   IMAGE                                                COMMAND                 CREATED        STATUS                     PORTS                                        NAMES
d6a0622db806   ftgo-application-master_ftgo-order-history-service   "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_ftgo-order-history-service_1
fb26b4c2c593   ftgo-application-master_ftgo-kitchen-service         "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_ftgo-kitchen-service_1
765c0049d14f   ftgo-application-master_ftgo-consumer-service        "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_ftgo-consumer-service_1
39b4be440915   ftgo-application-master_ftgo-order-service           "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_ftgo-order-service_1
e11b2a5f1bf1   ftgo-application-master_ftgo-accounting-service      "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_ftgo-accounting-service_1
14278276b42e   ftgo-application-master_ftgo-restaurant-service      "/bin/sh -c 'java ${"   3 hours ago    Exited (137) 2 hours ago                                                ftgo-application-master_ftgo-restaurant-service_1
32e478759f46   eventuateio/eventuate-cdc-service:0.4.0.RELEASE      "/bin/sh -c 'java ${"   3 hours ago    Exited (1) 2 hours ago                                                  ftgo-application-master_cdc-service_1
47110c01a6a7   eventuateio/eventuate-kafka:0.3.0.RELEASE            "/bin/bash -c ./run-"   3 hours ago    Exited (137) 2 hours ago                                                ftgo-application-master_kafka_1
7b6d9f6411f0   ftgo-application-master_dynamodblocal-init           "/bin/sh -c './wait-"   3 hours ago    Up 3 hours (healthy)                                                    ftgo-application-master_dynamodblocal-init_1
37f828ac67e0   openzipkin/zipkin:2.5.0                              "/bin/sh -c 'test -n"   3 hours ago    Up 3 hours                 9410/tcp, 0.0.0.0:9411->9411/tcp             ftgo-application-master_zipkin_1
fcd514af0edd   eventuateio/eventuate-zookeeper:0.4.0.RELEASE        "/usr/local/zookeepe"   3 hours ago    Up 3 hours                 2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp   ftgo-application-master_zookeeper_1
4f61d23cafb7   ftgo-application-master_mysql                        "docker-entrypoint.s"   3 hours ago    Up 3 hours                 0.0.0.0:3306->3306/tcp                       ftgo-application-master_mysql_1
cdb12f2404fd   ftgo-application-master_dynamodblocal                "/bin/sh -c 'java -j"   3 hours ago    Up 3 hours (healthy)       0.0.0.0:8000->8000/tcp                       ftgo-application-master_dynamodblocal_1
dc91e5a7b938   ftgo-application-master_ftgo-api-gateway             "/bin/sh -c 'java ${"   3 hours ago    Up 3 hours (healthy)       0.0.0.0:8087->8080/tcp                       ftgo-application-master_ftgo-api-gateway_1
f6d7fd1eea2c   openzipkin/zipkin                                    "/busybox/sh run.sh"    11 days ago    Created                                                                 kind_babbage
2a1513a0c6e3   openzipkin/zipkin                                    "/busybox/sh run.sh"    11 days ago    Exited (255) 4 hours ago   9410/tcp, 0.0.0.0:9411->9411/tcp             kind_hellman
05e99c400e01   prakashmum/security-simple:v1                        "/bin/sh -c 'java -j"   2 weeks ago    Exited (255) 11 days ago   0.0.0.0:8086->8086/tcp                       hopeful_merkle
2a7b4ec1b962   docker-spring-boot                                   "java jar docker-spr"   4 months ago   Exited (1) 4 months ago                                                 dazzling_elgamal
fcaedbbe0531   docker-spring-boot                                   "java jar docker-spr"   4 months ago   Exited (1) 4 months ago                                                 ecstatic_gagarin

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>
```

cer commented 4 years ago

It looks like ftgo-application-master_kafka_1 and everything that depends upon it has exited.

What's the output of docker logs ftgo-application-master_kafka_1?
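A generic way to collect that kind of evidence for every dead container at once, in case it is useful to others debugging the same setup. The project-name filter and the `--tail` length here are assumptions, based only on the container names shown above:

```shell
# Dump the tail of every exited container's log for a given compose project.
# Requires a running Docker daemon; the project prefix is passed as an argument.
tail_exited_logs() {
  project="$1"
  for c in $(docker ps -a --filter status=exited --filter "name=$project" --format '{{.Names}}'); do
    echo "===== $c ====="
    docker logs --tail 50 "$c"
  done
}

# Usage (prefix taken from the docker ps output above):
# tail_exited_logs ftgo-application-master
```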

prakashid2 commented 4 years ago

```
D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker logs ftgo-application-master_kafka_1
ZOOKEEPER_CONNECTION_TIMEOUT_MS is not set. Setting to 6000
ADVERTISED_HOST_NAME=192.168.99.100
/usr/local/kafka-config/server.properties -> ./config/server.properties
[2020-04-15 13:10:23,385] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2020-04-15 13:10:38,647] INFO starting (kafka.server.KafkaServer)
[2020-04-15 13:10:38,661] INFO Connecting to zookeeper on zookeeper:2181 (kafka.server.KafkaServer)
[2020-04-15 13:10:38,936] INFO [ZooKeeperClient] Initializing a new session to zookeeper:2181. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:10:39,027] INFO Client environment:zookeeper.version=3.4.10-39d3a4f269333c922ed3db283be479f9deacaa0f, built on 03/23/2017 10:13 GMT (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,029] INFO Client environment:host.name=47110c01a6a7 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,030] INFO Client environment:java.version=1.8.0_91 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,031] INFO Client environment:java.vendor=Oracle Corporation (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,031] INFO Client environment:java.home=/usr/lib/jvm/java-8-openjdk-amd64/jre (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,031] INFO Client environment:java.class.path=/usr/local/kafka_2.12-1.1.0/bin/../libs/aopalliance-repackaged-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/argparse4j-0.7.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/commons-lang3-3.5.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-api-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-file-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-json-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-runtime-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-transforms-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/guava-20.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-api-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-locator-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-utils-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-annotations-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-core-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-databind-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-jaxrs-base-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-jaxrs-json-provider-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-module-jaxb-annotations-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javassist-3.20.0-GA.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javassist-3.21.0-GA.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.annotation-api-1.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.inject-1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.inject-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.servlet-api-3.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.ws.rs-api-2.0.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-client-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-common-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-container-servlet-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-container-servlet-core-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-guava-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-media-jaxb-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-server-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-client-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-continuation-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-http-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-io-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-security-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-server-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-servlet-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-servlets-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-util-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jopt-simple-5.0.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-clients-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-log4j-appender-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-examples-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-test-utils-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-tools-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0-sources.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0-test-sources.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/log4j-1.2.17.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/lz4-java-1.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/maven-artifact-3.5.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/metrics-core-2.2.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/osgi-resource-locator-1.0.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/plexus-utils-3.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/reflections-0.9.11.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/rocksdbjni-5.7.3.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-library-2.12.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-logging_2.12-3.7.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-reflect-2.12.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/slf4j-api-1.7.25.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/slf4j-log4j12-1.7.25.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/snappy-java-1.1.7.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/validation-api-1.1.0.Final.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/zkclient-0.10.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/zookeeper-3.4.10.jar (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,037] INFO Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,038] INFO Client environment:java.io.tmpdir=/tmp (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,039] INFO Client environment:java.compiler= (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,041] INFO Client environment:os.name=Linux (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,043] INFO Client environment:os.arch=amd64 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,044] INFO Client environment:os.version=4.14.154-boot2docker (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,046] INFO Client environment:user.name=root (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,047] INFO Client environment:user.home=/root (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,048] INFO Client environment:user.dir=/usr/local/kafka_2.12-1.1.0 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,057] INFO Initiating client connection, connectString=zookeeper:2181 sessionTimeout=6000 watcher=kafka.zookeeper.ZooKeeperClient$ZooKeeperClientWatcher$@be64738 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:10:39,600] INFO [ZooKeeperClient] Waiting until connected. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:10:39,616] INFO Opening socket connection to server ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181. Will not attempt to authenticate using SASL (unknown error) (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:10:39,964] INFO Socket connection established to ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181, initiating session (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:10:43,601] INFO Session establishment complete on server ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181, sessionid = 0x1717df621f00000, negotiated timeout = 6000 (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:10:43,661] INFO [ZooKeeperClient] Connected. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:10:50,199] INFO Cluster ID = Es9-wGqqR_2tQoIR65NhiQ (kafka.server.KafkaServer)
[2020-04-15 13:10:50,370] WARN No meta.properties file under dir /tmp/kafka-logs/meta.properties (kafka.server.BrokerMetadataCheckpoint)
[2020-04-15 13:10:53,016] INFO KafkaConfig values:
    advertised.host.name = null advertised.listeners = PLAINTEXT://192.168.99.100:9092 advertised.port = null alter.config.policy.class.name = null
    alter.log.dirs.replication.quota.window.num = 11 alter.log.dirs.replication.quota.window.size.seconds = 1 authorizer.class.name =
    auto.create.topics.enable = true auto.leader.rebalance.enable = true background.threads = 10 broker.id = 0 broker.id.generation.enable = true
    broker.rack = null compression.type = producer connections.max.idle.ms = 600000 controlled.shutdown.enable = true controlled.shutdown.max.retries = 3
    controlled.shutdown.retry.backoff.ms = 5000 controller.socket.timeout.ms = 30000 create.topic.policy.class.name = null default.replication.factor = 1
    delegation.token.expiry.check.interval.ms = 3600000 delegation.token.expiry.time.ms = 86400000 delegation.token.master.key = null
    delegation.token.max.lifetime.ms = 604800000 delete.records.purgatory.purge.interval.requests = 1 delete.topic.enable = true
    fetch.purgatory.purge.interval.requests = 1000 group.initial.rebalance.delay.ms = 3000 group.max.session.timeout.ms = 300000
    group.min.session.timeout.ms = 6000 host.name = inter.broker.listener.name = null inter.broker.protocol.version = 1.1-IV0
    leader.imbalance.check.interval.seconds = 300 leader.imbalance.per.broker.percentage = 10
    listener.security.protocol.map = PLAINTEXT:PLAINTEXT,SSL:SSL,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL listeners = null
    log.cleaner.backoff.ms = 15000 log.cleaner.dedupe.buffer.size = 134217728 log.cleaner.delete.retention.ms = 86400000 log.cleaner.enable = true
    log.cleaner.io.buffer.load.factor = 0.9 log.cleaner.io.buffer.size = 524288 log.cleaner.io.max.bytes.per.second = 1.7976931348623157E308
    log.cleaner.min.cleanable.ratio = 0.5 log.cleaner.min.compaction.lag.ms = 0 log.cleaner.threads = 1 log.cleanup.policy = [delete]
    log.dir = /tmp/kafka-logs log.dirs = /tmp/kafka-logs log.flush.interval.messages = 9223372036854775807 log.flush.interval.ms = null
    log.flush.offset.checkpoint.interval.ms = 60000 log.flush.scheduler.interval.ms = 9223372036854775807
    log.flush.start.offset.checkpoint.interval.ms = 60000 log.index.interval.bytes = 4096 log.index.size.max.bytes = 10485760
    log.message.format.version = 1.1-IV0 log.message.timestamp.difference.max.ms = 9223372036854775807 log.message.timestamp.type = CreateTime
    log.preallocate = false log.retention.bytes = -1 log.retention.check.interval.ms = 300000 log.retention.hours = 168 log.retention.minutes = null
    log.retention.ms = null log.roll.hours = 168 log.roll.jitter.hours = 0 log.roll.jitter.ms = null log.roll.ms = null log.segment.bytes = 1073741824
    log.segment.delete.delay.ms = 60000 max.connections.per.ip = 2147483647 max.connections.per.ip.overrides =
    max.incremental.fetch.session.cache.slots = 1000 message.max.bytes = 1000012 metric.reporters = [] metrics.num.samples = 2
    metrics.recording.level = INFO metrics.sample.window.ms = 30000 min.insync.replicas = 1 num.io.threads = 8 num.network.threads = 3
    num.partitions = 2 num.recovery.threads.per.data.dir = 1 num.replica.alter.log.dirs.threads = null num.replica.fetchers = 1
    offset.metadata.max.bytes = 4096 offsets.commit.required.acks = -1 offsets.commit.timeout.ms = 5000 offsets.load.buffer.size = 5242880
    offsets.retention.check.interval.ms = 600000 offsets.retention.minutes = 1440 offsets.topic.compression.codec = 0 offsets.topic.num.partitions = 50
    offsets.topic.replication.factor = 1 offsets.topic.segment.bytes = 104857600 password.encoder.cipher.algorithm = AES/CBC/PKCS5Padding
    password.encoder.iterations = 4096 password.encoder.key.length = 128 password.encoder.keyfactory.algorithm = null password.encoder.old.secret = null
    password.encoder.secret = null port = 9092 principal.builder.class = null producer.purgatory.purge.interval.requests = 1000
    queued.max.request.bytes = -1 queued.max.requests = 500 quota.consumer.default = 9223372036854775807 quota.producer.default = 9223372036854775807
    quota.window.num = 11 quota.window.size.seconds = 1 replica.fetch.backoff.ms = 1000 replica.fetch.max.bytes = 1048576 replica.fetch.min.bytes = 1
    replica.fetch.response.max.bytes = 10485760 replica.fetch.wait.max.ms = 500 replica.high.watermark.checkpoint.interval.ms = 5000
    replica.lag.time.max.ms = 10000 replica.socket.receive.buffer.bytes = 65536 replica.socket.timeout.ms = 30000 replication.quota.window.num = 11
    replication.quota.window.size.seconds = 1 request.timeout.ms = 30000 reserved.broker.max.id = 1000 sasl.enabled.mechanisms = [GSSAPI]
    sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.principal.to.local.rules = [DEFAULT] sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism.inter.broker.protocol = GSSAPI security.inter.broker.protocol = PLAINTEXT
    socket.receive.buffer.bytes = 102400 socket.request.max.bytes = 104857600 socket.send.buffer.bytes = 102400 ssl.cipher.suites = []
    ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null
    ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS
    ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null
    ssl.truststore.password = null ssl.truststore.type = JKS transaction.abort.timed.out.transaction.cleanup.interval.ms = 60000
    transaction.max.timeout.ms = 900000 transaction.remove.expired.transaction.cleanup.interval.ms = 3600000
    transaction.state.log.load.buffer.size = 5242880 transaction.state.log.min.isr = 2 transaction.state.log.num.partitions = 50
    transaction.state.log.replication.factor = 3 transaction.state.log.segment.bytes = 104857600 transactional.id.expiration.ms = 604800000
    unclean.leader.election.enable = false zookeeper.connect = zookeeper:2181 zookeeper.connection.timeout.ms = 6000
    zookeeper.max.in.flight.requests = 10 zookeeper.session.timeout.ms = 6000 zookeeper.set.acl = false zookeeper.sync.time.ms = 2000
 (kafka.server.KafkaConfig)
[2020-04-15 13:10:54,078] INFO KafkaConfig values: [identical dump repeated by the broker; same values as at 13:10:53,016] (kafka.server.KafkaConfig)
[2020-04-15 13:10:57,031] INFO Log directory '/tmp/kafka-logs' not found, creating it. (kafka.log.LogManager)
[2020-04-15 13:10:57,426] INFO Loading logs. (kafka.log.LogManager)
[2020-04-15 13:10:57,762] INFO Logs loading complete in 326 ms. (kafka.log.LogManager)
[2020-04-15 13:10:58,096] INFO Starting log cleanup with a period of 300000 ms. (kafka.log.LogManager)
[2020-04-15 13:10:58,607] INFO Starting log flusher with a default period of 9223372036854775807 ms. (kafka.log.LogManager)
[2020-04-15 13:11:19,282] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
[2020-04-15 13:11:21,171] INFO [SocketServer brokerId=0] Started 1 acceptor threads (kafka.network.SocketServer)
[2020-04-15 13:11:23,702] INFO Creating /brokers/ids/0 (is it secure? false) (kafka.zk.KafkaZkClient)
[2020-04-15 13:11:23,858] INFO Result of znode creation at /brokers/ids/0 is: OK (kafka.zk.KafkaZkClient)
[2020-04-15 13:11:23,895] INFO Registered broker 0 at path /brokers/ids/0 with addresses: ArrayBuffer(EndPoint(192.168.99.100,9092,ListenerName(PLAINTEXT),PLAINTEXT)) (kafka.zk.KafkaZkClient)
[2020-04-15 13:11:23,955] WARN No meta.properties file under dir /tmp/kafka-logs/meta.properties (kafka.server.BrokerMetadataCheckpoint)
[2020-04-15 13:11:25,432] INFO Creating /controller (is it secure? false) (kafka.zk.KafkaZkClient)
[2020-04-15 13:11:25,534] INFO Result of znode creation at /controller is: OK (kafka.zk.KafkaZkClient)
[2020-04-15 13:11:26,155] INFO [GroupCoordinator 0]: Starting up. (kafka.coordinator.group.GroupCoordinator)
[2020-04-15 13:11:26,193] INFO [GroupCoordinator 0]: Startup complete. (kafka.coordinator.group.GroupCoordinator)
[2020-04-15 13:11:26,417] INFO [GroupMetadataManager brokerId=0] Removed 0 expired offsets in 220 milliseconds. (kafka.coordinator.group.GroupMetadataManager)
[2020-04-15 13:11:26,755] INFO [ProducerId Manager 0]: Acquired new producerId block (brokerId:0,blockStartProducerId:0,blockEndProducerId:999) by writing to Zk with path version 1 (kafka.coordinator.transaction.ProducerIdManager)
[2020-04-15 13:11:28,118] INFO [TransactionCoordinator id=0] Starting up. (kafka.coordinator.transaction.TransactionCoordinator)
[2020-04-15 13:11:28,242] INFO [TransactionCoordinator id=0] Startup complete. (kafka.coordinator.transaction.TransactionCoordinator)
[2020-04-15 13:11:33,017] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
[2020-04-15 13:11:33,789] INFO Kafka version : 1.1.0 (org.apache.kafka.common.utils.AppInfoParser)
[2020-04-15 13:11:33,870] INFO Kafka commitId : fdcf75ea326b8e07 (org.apache.kafka.common.utils.AppInfoParser)
[2020-04-15 13:11:33,913] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)
ZOOKEEPER_CONNECTION_TIMEOUT_MS is not set. Setting to 6000
ADVERTISED_HOST_NAME=192.168.99.100
/usr/local/kafka-config/server.properties -> ./config/server.properties
[2020-04-15 13:58:16,223] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2020-04-15 13:58:26,097] INFO starting (kafka.server.KafkaServer)
[2020-04-15 13:58:26,110] INFO Connecting to zookeeper on zookeeper:2181 (kafka.server.KafkaServer)
[2020-04-15 13:58:26,387] INFO [ZooKeeperClient] Initializing a new session to zookeeper:2181. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:58:26,448] INFO Client environment:zookeeper.version=3.4.10-39d3a4f269333c922ed3db283be479f9deacaa0f, built on 03/23/2017 10:13 GMT (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,448] INFO Client environment:host.name=47110c01a6a7 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,450] INFO Client environment:java.version=1.8.0_91 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,450] INFO Client environment:java.vendor=Oracle Corporation (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,451] INFO Client environment:java.home=/usr/lib/jvm/java-8-openjdk-amd64/jre (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,452] INFO Client environment:java.class.path=/usr/local/kafka_2.12-1.1.0/bin/../libs/aopalliance-repackaged-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/argparse4j-0.7.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/commons-lang3-3.5.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-api-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-file-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-json-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-runtime-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/connect-transforms-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/guava-20.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-api-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-locator-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/hk2-utils-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-annotations-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-core-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-databind-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-jaxrs-base-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-jaxrs-json-provider-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jackson-module-jaxb-annotations-2.9.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javassist-3.20.0-GA.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javassist-3.21.0-GA.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.annotation-api-1.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.inject-1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.inject-2.5.0-b32.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.servlet-api-3.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/javax.ws.rs-api-2.0.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-client-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-common-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-container-servlet-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-container-servlet-core-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-guava-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-media-jaxb-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jersey-server-2.25.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-client-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-continuation-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-http-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-io-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-security-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-server-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-servlet-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-servlets-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jetty-util-9.2.24.v20180105.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/jopt-simple-5.0.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-clients-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-log4j-appender-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-examples-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-streams-test-utils-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka-tools-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0-sources.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0-test-sources.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/kafka_2.12-1.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/log4j-1.2.17.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/lz4-java-1.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/maven-artifact-3.5.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/metrics-core-2.2.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/osgi-resource-locator-1.0.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/plexus-utils-3.1.0.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/reflections-0.9.11.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/rocksdbjni-5.7.3.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-library-2.12.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-logging_2.12-3.7.2.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/scala-reflect-2.12.4.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/slf4j-api-1.7.25.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/slf4j-log4j12-1.7.25.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/snappy-java-1.1.7.1.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/validation-api-1.1.
```
0.Final.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/zkclient-0.10.jar:/usr/local/kafka_2.12-1.1.0/bin/../libs/zookeeper-3.4.10.jar (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,454] INFO Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,457] INFO Client environment:java.io.tmpdir=/tmp (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,458] INFO Client environment:java.compiler= (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,459] INFO Client environment:os.name=Linux (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,460] INFO Client environment:os.arch=amd64 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,460] INFO Client environment:os.version=4.14.154-boot2docker (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,461] INFO Client environment:user.name=root (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,462] INFO Client environment:user.home=/root (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,462] INFO Client environment:user.dir=/usr/local/kafka_2.12-1.1.0 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,467] INFO Initiating client connection, connectString=zookeeper:2181 sessionTimeout=6000 watcher=kafka.zookeeper.ZooKeeperClient$ZooKeeperClientWatcher$@be64738 (org.apache.zookeeper.ZooKeeper)
[2020-04-15 13:58:26,739] INFO [ZooKeeperClient] Waiting until connected. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:58:26,801] INFO Opening socket connection to server ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181.
Will not attempt to authenticate using SASL (unknown error) (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:58:26,932] INFO Socket connection established to ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181, initiating session (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:58:27,578] INFO Session establishment complete on server ftgo-application-master_zookeeper_1.ftgo-application-master_default/172.18.0.5:2181, sessionid = 0x1717df621f00002, negotiated timeout = 6000 (org.apache.zookeeper.ClientCnxn)
[2020-04-15 13:58:27,800] INFO [ZooKeeperClient] Connected. (kafka.zookeeper.ZooKeeperClient)
[2020-04-15 13:58:31,407] INFO Cluster ID = Es9-wGqqR_2tQoIR65NhiQ (kafka.server.KafkaServer)
[2020-04-15 13:58:32,744] INFO KafkaConfig values: advertised.host.name = null advertised.listeners = PLAINTEXT://192.168.99.100:9092 advertised.port = null alter.config.policy.class.name = null alter.log.dirs.replication.quota.window.num = 11 alter.log.dirs.replication.quota.window.size.seconds = 1 authorizer.class.name = auto.create.topics.enable = true auto.leader.rebalance.enable = true background.threads = 10 broker.id = 0 broker.id.generation.enable = true broker.rack = null compression.type = producer connections.max.idle.ms = 600000 controlled.shutdown.enable = true controlled.shutdown.max.retries = 3 controlled.shutdown.retry.backoff.ms = 5000 controller.socket.timeout.ms = 30000 create.topic.policy.class.name = null default.replication.factor = 1 delegation.token.expiry.check.interval.ms = 3600000 delegation.token.expiry.time.ms = 86400000 delegation.token.master.key = null delegation.token.max.lifetime.ms = 604800000 delete.records.purgatory.purge.interval.requests = 1 delete.topic.enable = true fetch.purgatory.purge.interval.requests = 1000 group.initial.rebalance.delay.ms = 3000 group.max.session.timeout.ms = 300000 group.min.session.timeout.ms = 6000 host.name = inter.broker.listener.name = null inter.broker.protocol.version =
1.1-IV0 leader.imbalance.check.interval.seconds = 300 leader.imbalance.per.broker.percentage = 10 listener.security.protocol.map = PLAINTEXT:PLAINTEXT,SSL:SSL,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL listeners = null log.cleaner.backoff.ms = 15000 log.cleaner.dedupe.buffer.size = 134217728 log.cleaner.delete.retention.ms = 86400000 log.cleaner.enable = true log.cleaner.io.buffer.load.factor = 0.9 log.cleaner.io.buffer.size = 524288 log.cleaner.io.max.bytes.per.second = 1.7976931348623157E308 log.cleaner.min.cleanable.ratio = 0.5 log.cleaner.min.compaction.lag.ms = 0 log.cleaner.threads = 1 log.cleanup.policy = [delete] log.dir = /tmp/kafka-logs log.dirs = /tmp/kafka-logs log.flush.interval.messages = 9223372036854775807 log.flush.interval.ms = null log.flush.offset.checkpoint.interval.ms = 60000 log.flush.scheduler.interval.ms = 9223372036854775807 log.flush.start.offset.checkpoint.interval.ms = 60000 log.index.interval.bytes = 4096 log.index.size.max.bytes = 10485760 log.message.format.version = 1.1-IV0 log.message.timestamp.difference.max.ms = 9223372036854775807 log.message.timestamp.type = CreateTime log.preallocate = false log.retention.bytes = -1 log.retention.check.interval.ms = 300000 log.retention.hours = 168 log.retention.minutes = null log.retention.ms = null log.roll.hours = 168 log.roll.jitter.hours = 0 log.roll.jitter.ms = null log.roll.ms = null log.segment.bytes = 1073741824 log.segment.delete.delay.ms = 60000 max.connections.per.ip = 2147483647 max.connections.per.ip.overrides = max.incremental.fetch.session.cache.slots = 1000 message.max.bytes = 1000012 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 min.insync.replicas = 1 num.io.threads = 8 num.network.threads = 3 num.partitions = 2 num.recovery.threads.per.data.dir = 1 num.replica.alter.log.dirs.threads = null num.replica.fetchers = 1 offset.metadata.max.bytes = 4096 offsets.commit.required.acks = -1 offsets.commit.timeout.ms 
= 5000 offsets.load.buffer.size = 5242880 offsets.retention.check.interval.ms = 600000 offsets.retention.minutes = 1440 offsets.topic.compression.codec = 0 offsets.topic.num.partitions = 50 offsets.topic.replication.factor = 1 offsets.topic.segment.bytes = 104857600 password.encoder.cipher.algorithm = AES/CBC/PKCS5Padding password.encoder.iterations = 4096 password.encoder.key.length = 128 password.encoder.keyfactory.algorithm = null password.encoder.old.secret = null password.encoder.secret = null port = 9092 principal.builder.class = null producer.purgatory.purge.interval.requests = 1000 queued.max.request.bytes = -1 queued.max.requests = 500 quota.consumer.default = 9223372036854775807 quota.producer.default = 9223372036854775807 quota.window.num = 11 quota.window.size.seconds = 1 replica.fetch.backoff.ms = 1000 replica.fetch.max.bytes = 1048576 replica.fetch.min.bytes = 1 replica.fetch.response.max.bytes = 10485760 replica.fetch.wait.max.ms = 500 replica.high.watermark.checkpoint.interval.ms = 5000 replica.lag.time.max.ms = 10000 replica.socket.receive.buffer.bytes = 65536 replica.socket.timeout.ms = 30000 replication.quota.window.num = 11 replication.quota.window.size.seconds = 1 request.timeout.ms = 30000 reserved.broker.max.id = 1000 sasl.enabled.mechanisms = [GSSAPI] sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.principal.to.local.rules = [DEFAULT] sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism.inter.broker.protocol = GSSAPI security.inter.broker.protocol = PLAINTEXT socket.receive.buffer.bytes = 102400 socket.request.max.bytes = 104857600 socket.send.buffer.bytes = 102400 ssl.cipher.suites = [] ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location 
= null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.abort.timed.out.transaction.cleanup.interval.ms = 60000 transaction.max.timeout.ms = 900000 transaction.remove.expired.transaction.cleanup.interval.ms = 3600000 transaction.state.log.load.buffer.size = 5242880 transaction.state.log.min.isr = 2 transaction.state.log.num.partitions = 50 transaction.state.log.replication.factor = 3 transaction.state.log.segment.bytes = 104857600 transactional.id.expiration.ms = 604800000 unclean.leader.election.enable = false zookeeper.connect = zookeeper:2181 zookeeper.connection.timeout.ms = 6000 zookeeper.max.in.flight.requests = 10 zookeeper.session.timeout.ms = 6000 zookeeper.set.acl = false zookeeper.sync.time.ms = 2000 (kafka.server.KafkaConfig) [2020-04-15 13:58:32,965] INFO KafkaConfig values: advertised.host.name = null advertised.listeners = PLAINTEXT://192.168.99.100:9092 advertised.port = null alter.config.policy.class.name = null alter.log.dirs.replication.quota.window.num = 11 alter.log.dirs.replication.quota.window.size.seconds = 1 authorizer.class.name = auto.create.topics.enable = true auto.leader.rebalance.enable = true background.threads = 10 broker.id = 0 broker.id.generation.enable = true broker.rack = null compression.type = producer connections.max.idle.ms = 600000 controlled.shutdown.enable = true controlled.shutdown.max.retries = 3 controlled.shutdown.retry.backoff.ms = 5000 controller.socket.timeout.ms = 30000 create.topic.policy.class.name = null default.replication.factor = 1 delegation.token.expiry.check.interval.ms = 3600000 delegation.token.expiry.time.ms = 86400000 delegation.token.master.key = null delegation.token.max.lifetime.ms = 604800000 delete.records.purgatory.purge.interval.requests = 1 delete.topic.enable = true 
fetch.purgatory.purge.interval.requests = 1000 group.initial.rebalance.delay.ms = 3000 group.max.session.timeout.ms = 300000 group.min.session.timeout.ms = 6000 host.name = inter.broker.listener.name = null inter.broker.protocol.version = 1.1-IV0 leader.imbalance.check.interval.seconds = 300 leader.imbalance.per.broker.percentage = 10 listener.security.protocol.map = PLAINTEXT:PLAINTEXT,SSL:SSL,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL listeners = null log.cleaner.backoff.ms = 15000 log.cleaner.dedupe.buffer.size = 134217728 log.cleaner.delete.retention.ms = 86400000 log.cleaner.enable = true log.cleaner.io.buffer.load.factor = 0.9 log.cleaner.io.buffer.size = 524288 log.cleaner.io.max.bytes.per.second = 1.7976931348623157E308 log.cleaner.min.cleanable.ratio = 0.5 log.cleaner.min.compaction.lag.ms = 0 log.cleaner.threads = 1 log.cleanup.policy = [delete] log.dir = /tmp/kafka-logs log.dirs = /tmp/kafka-logs log.flush.interval.messages = 9223372036854775807 log.flush.interval.ms = null log.flush.offset.checkpoint.interval.ms = 60000 log.flush.scheduler.interval.ms = 9223372036854775807 log.flush.start.offset.checkpoint.interval.ms = 60000 log.index.interval.bytes = 4096 log.index.size.max.bytes = 10485760 log.message.format.version = 1.1-IV0 log.message.timestamp.difference.max.ms = 9223372036854775807 log.message.timestamp.type = CreateTime log.preallocate = false log.retention.bytes = -1 log.retention.check.interval.ms = 300000 log.retention.hours = 168 log.retention.minutes = null log.retention.ms = null log.roll.hours = 168 log.roll.jitter.hours = 0 log.roll.jitter.ms = null log.roll.ms = null log.segment.bytes = 1073741824 log.segment.delete.delay.ms = 60000 max.connections.per.ip = 2147483647 max.connections.per.ip.overrides = max.incremental.fetch.session.cache.slots = 1000 message.max.bytes = 1000012 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 min.insync.replicas = 1 num.io.threads = 8 
num.network.threads = 3 num.partitions = 2 num.recovery.threads.per.data.dir = 1 num.replica.alter.log.dirs.threads = null num.replica.fetchers = 1 offset.metadata.max.bytes = 4096 offsets.commit.required.acks = -1 offsets.commit.timeout.ms = 5000 offsets.load.buffer.size = 5242880 offsets.retention.check.interval.ms = 600000 offsets.retention.minutes = 1440 offsets.topic.compression.codec = 0 offsets.topic.num.partitions = 50 offsets.topic.replication.factor = 1 offsets.topic.segment.bytes = 104857600 password.encoder.cipher.algorithm = AES/CBC/PKCS5Padding password.encoder.iterations = 4096 password.encoder.key.length = 128 password.encoder.keyfactory.algorithm = null password.encoder.old.secret = null password.encoder.secret = null port = 9092 principal.builder.class = null producer.purgatory.purge.interval.requests = 1000 queued.max.request.bytes = -1 queued.max.requests = 500 quota.consumer.default = 9223372036854775807 quota.producer.default = 9223372036854775807 quota.window.num = 11 quota.window.size.seconds = 1 replica.fetch.backoff.ms = 1000 replica.fetch.max.bytes = 1048576 replica.fetch.min.bytes = 1 replica.fetch.response.max.bytes = 10485760 replica.fetch.wait.max.ms = 500 replica.high.watermark.checkpoint.interval.ms = 5000 replica.lag.time.max.ms = 10000 replica.socket.receive.buffer.bytes = 65536 replica.socket.timeout.ms = 30000 replication.quota.window.num = 11 replication.quota.window.size.seconds = 1 request.timeout.ms = 30000 reserved.broker.max.id = 1000 sasl.enabled.mechanisms = [GSSAPI] sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.principal.to.local.rules = [DEFAULT] sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism.inter.broker.protocol = GSSAPI security.inter.broker.protocol = PLAINTEXT socket.receive.buffer.bytes = 102400 socket.request.max.bytes = 104857600 
socket.send.buffer.bytes = 102400 ssl.cipher.suites = [] ssl.client.auth = none ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.abort.timed.out.transaction.cleanup.interval.ms = 60000 transaction.max.timeout.ms = 900000 transaction.remove.expired.transaction.cleanup.interval.ms = 3600000 transaction.state.log.load.buffer.size = 5242880 transaction.state.log.min.isr = 2 transaction.state.log.num.partitions = 50 transaction.state.log.replication.factor = 3 transaction.state.log.segment.bytes = 104857600 transactional.id.expiration.ms = 604800000 unclean.leader.election.enable = false zookeeper.connect = zookeeper:2181 zookeeper.connection.timeout.ms = 6000 zookeeper.max.in.flight.requests = 10 zookeeper.session.timeout.ms = 6000 zookeeper.set.acl = false zookeeper.sync.time.ms = 2000 (kafka.server.KafkaConfig)

[2020-04-15 13:58:33,881] INFO Loading logs. (kafka.log.LogManager)
[2020-04-15 13:58:34,038] INFO Logs loading complete in 147 ms. (kafka.log.LogManager)
[2020-04-15 13:58:34,158] INFO Starting log cleanup with a period of 300000 ms. (kafka.log.LogManager)
[2020-04-15 13:58:34,196] INFO Starting log flusher with a default period of 9223372036854775807 ms. (kafka.log.LogManager)
[2020-04-15 13:58:42,787] INFO Awaiting socket connections on 0.0.0.0:9092. (kafka.network.Acceptor)
[2020-04-15 13:58:43,970] INFO [SocketServer brokerId=0] Started 1 acceptor threads (kafka.network.SocketServer)

[2020-04-15 13:58:46,954] INFO Creating /brokers/ids/0 (is it secure? false) (kafka.zk.KafkaZkClient)
[2020-04-15 13:58:47,029] INFO Result of znode creation at /brokers/ids/0 is: OK (kafka.zk.KafkaZkClient)
[2020-04-15 13:58:47,062] INFO Registered broker 0 at path /brokers/ids/0 with addresses: ArrayBuffer(EndPoint(192.168.99.100,9092,ListenerName(PLAINTEXT),PLAINTEXT)) (kafka.zk.KafkaZkClient)

[2020-04-15 13:58:48,172] INFO Creating /controller (is it secure? false) (kafka.zk.KafkaZkClient)
[2020-04-15 13:58:48,207] INFO Result of znode creation at /controller is: OK (kafka.zk.KafkaZkClient)

[2020-04-15 13:58:48,812] INFO [GroupCoordinator 0]: Starting up. (kafka.coordinator.group.GroupCoordinator)
[2020-04-15 13:58:48,831] INFO [GroupCoordinator 0]: Startup complete. (kafka.coordinator.group.GroupCoordinator)
[2020-04-15 13:58:49,015] INFO [GroupMetadataManager brokerId=0] Removed 0 expired offsets in 184 milliseconds. (kafka.coordinator.group.GroupMetadataManager)
[2020-04-15 13:58:49,282] INFO [ProducerId Manager 0]: Acquired new producerId block (brokerId:0,blockStartProducerId:1000,blockEndProducerId:1999) by writing to Zk with path version 2 (kafka.coordinator.transaction.ProducerIdManager)
[2020-04-15 13:58:50,229] INFO [TransactionCoordinator id=0] Starting up. (kafka.coordinator.transaction.TransactionCoordinator)
[2020-04-15 13:58:50,289] INFO [TransactionCoordinator id=0] Startup complete. (kafka.coordinator.transaction.TransactionCoordinator)

[2020-04-15 13:58:54,626] INFO [/config/changes-event-process-thread]: Starting (kafka.common.ZkNodeChangeNotificationListener$ChangeEventProcessThread)
[2020-04-15 13:58:54,963] INFO Kafka version : 1.1.0 (org.apache.kafka.common.utils.AppInfoParser)
[2020-04-15 13:58:54,969] INFO Kafka commitId : fdcf75ea326b8e07 (org.apache.kafka.common.utils.AppInfoParser)
[2020-04-15 13:58:55,015] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>
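Since docker-compose reported every container as created yet the swagger UIs refuse connections, a quick TCP probe against DOCKER_HOST_IP can distinguish "service not listening" from "port not published". A minimal sketch (the host 192.168.99.100 and port 8087 are just the values from this report; `port_open` is a hypothetical helper, not part of the FTGO code):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# DOCKER_HOST_IP and the API gateway port from this report
print(8087, "open" if port_open("192.168.99.100", 8087) else "refused/unreachable")
```

If the port probes as refused even though the container is up, the service inside it has likely not finished starting or has crashed, which is why checking the container logs is the next step.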

cer commented 4 years ago

Strange. No obvious error. What's the output of docker logs ftgo-application-master_ftgo-order-service?

cer commented 4 years ago

Did you do anything to your machine 3 hours ago?

prakashid2 commented 4 years ago

About 3-4 hours ago I ran the commands mentioned in README.adoc because I wanted to see the application running. Here is the output you requested.

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker logs ftgo-application-master_ftgo-order-service
Error: No such container: ftgo-application-master_ftgo-order-service

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

cer commented 4 years ago

Oops, I mistyped. It should be: docker logs ftgo-application-master_ftgo-order-service_1
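When the compose project prefix and the _1 suffix make names hard to guess, listing the containers avoids the typo entirely. A sketch (requires the Docker CLI and a running daemon; the filter pattern is just an assumption based on the container names above):

```shell
# List the FTGO containers with their exact names, status and published ports
docker ps --filter "name=ftgo" --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"

# Then tail the order service by its exact name
docker logs --tail 100 ftgo-application-master_ftgo-order-service_1
```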

prakashid2 commented 4 years ago

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker logs ftgo-application-master_ftgo-order-service_1
2020-04-15 13:11:32.390 INFO [-,,,] 1 --- [ main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@7a92922: startup date [Wed Apr 15 13:11:32 GMT 2020]; root of context hierarchy
2020-04-15 13:11:55.018 INFO [-,,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'configurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$10aebd90] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)

[Spring Boot ASCII-art banner] :: Spring Boot :: (v2.0.3.RELEASE)

2020-04-15 13:12:32.581 INFO [ftgo-order-service,,,] 1 --- [ main] n.c.f.o.main.OrderServiceMain : No active profile set, fallin g back to default profiles: default 2020-04-15 13:12:35.443 INFO [ftgo-order-service,,,] 1 --- [ main] ConfigServletWebServerApplicationContext : Refreshing org.springframewor k.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@4f7d0008: startup date [Wed Apr 15 13:12:35 GMT 2020]; parent: org.sprin gframework.context.annotation.AnnotationConfigApplicationContext@7a92922 2020-04-15 13:13:39.046 INFO [ftgo-order-service,,,] 1 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Overriding bean definition fo r bean 'sagaCommandProducer' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=io.eventuate.tram.sagas.orchestration.SagaOrchestratorConfiguration; factory MethodName=sagaCommandProducer; initMethodName=null; destroyMethodName=(inferred); defined in io.eventuate.tram.sagas.orchestration.SagaOrchestratorCo nfiguration] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary =false; factoryBeanName=net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration; factoryMethodName=sagaCommandProducer; initMethodName= null; destroyMethodName=(inferred); defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration] 2020-04-15 13:14:51.953 INFO [ftgo-order-service,,,] 1 --- [ main] o.s.b.f.s.DefaultListableBeanFactory : Overriding bean definition fo r bean 'dataSource' with a different definition: replacing [Root bean: class [null]; scope=refresh; abstract=false; lazyInit=false; autowireMode=3; de pendencyCheck=0; autowireCandidate=false; primary=false; factoryBeanName=org.springframework.boot.autoconfigure.jdbc.DataSourceConfiguration$Hikari; f 
actoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/boot/autoconfigure /jdbc/DataSourceConfiguration$Hikari.class]] with [Root bean: class [org.springframework.aop.scope.ScopedProxyFactoryBean]; scope=; abstract=false; la zyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=n ull; destroyMethodName=null; defined in BeanDefinition defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfigura tion$Hikari.class]] 2020-04-15 13:15:22.512 INFO [ftgo-order-service,,,] 1 --- [ main] o.s.cloud.context.scope.GenericScope : BeanFactory id=152ca69f-2431- 33f8-8a00-97525ce0a50b 2020-04-15 13:16:46.505 INFO [ftgo-order-service,,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.tra nsaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfigurati on$$EnhancerBySpringCGLIB$$f494ba93] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-04-15 13:16:55.102 INFO [ftgo-order-service,,,] 1 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.clo ud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAut oConfiguration$$EnhancerBySpringCGLIB$$10aebd90] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-p roxying) 2020-04-15 13:17:27.197 INFO [ftgo-order-service,,,] 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port( s): 8080 (http) 2020-04-15 13:17:29.325 INFO [ftgo-order-service,,,] 1 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2020-04-15 13:17:29.330 INFO 
[ftgo-order-service,,,] 1 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet Engine: Apac he Tomcat/8.5.31 2020-04-15 13:17:29.968 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : The APR based Apache Tomcat N ative library which allows optimal performance in production environments was not found on the java.library.path: [/usr/lib/jvm/java-1.8-openjdk/jre/l ib/amd64/server:/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64:/usr/lib/jvm/java-1.8-openjdk/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/li b64:/lib:/usr/lib] 2020-04-15 13:17:37.160 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2020-04-15 13:17:37.185 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.web.context.ContextLoader : Root WebApplicationContext: i nitialization completed in 301749 ms 2020-04-15 13:19:56.329 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'characterEnc odingFilter' to: [/] 2020-04-15 13:19:56.465 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'tracingFilte r' to: [/] 2020-04-15 13:19:56.473 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'exceptionLog gingFilter' to: [/] 2020-04-15 13:19:56.477 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'traceIdRespo nseFilter' to: [/] 2020-04-15 13:19:56.481 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'hiddenHttpMe thodFilter' to: [/] 2020-04-15 13:19:56.483 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'httpPutFormC ontentFilter' to: [/] 2020-04-15 13:19:56.488 INFO [ftgo-order-service,,,] 1 --- 
[ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'requestConte xtFilter' to: [/] 2020-04-15 13:19:56.488 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'httpTraceFil ter' to: [/] 2020-04-15 13:19:56.492 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'webMvcMetric sFilter' to: [/*] 2020-04-15 13:19:56.495 INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.ServletRegistrationBean : Servlet dispatcherServlet map ped to [/] 2020-04-15 13:19:58.189 DEBUG [ftgo-order-service,,,] 1 --- [ost-startStop-1] n.c.f.o.web.TraceIdResponseFilter : Initializing filter 'traceIdR esponseFilter' 2020-04-15 13:19:58.192 DEBUG [ftgo-order-service,,,] 1 --- [ost-startStop-1] n.c.f.o.web.TraceIdResponseFilter : Filter 'traceIdResponseFilter ' configured successfully 2020-04-15 13:20:04.848 INFO [ftgo-order-service,,,] 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting... Wed Apr 15 13:20:12 GMT 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL =true and provide truststore for server certificate verification. 2020-04-15 13:20:17.913 INFO [ftgo-order-service,,,] 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Start complete d. Wed Apr 15 13:20:18 GMT 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. 
Wed Apr 15 13:20:18 GMT 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.

(the same WARN is logged several more times at 13:20:18 and 13:20:19, once per pooled connection)

2020-04-15 13:20:25.804  INFO [ftgo-order-service,,,] 1 --- [ main] j.LocalContainerEntityManagerFactoryBean : Building JPA container EntityManagerFactory for persistence unit 'default'
2020-04-15 13:20:26.485  INFO [ftgo-order-service,,,] 1 --- [ main] o.hibernate.jpa.internal.util.LogHelper  : HHH000204: Processing PersistenceUnitInfo [ name: default ...]
2020-04-15 13:20:33.674  INFO [ftgo-order-service,,,] 1 --- [ main] org.hibernate.Version                    : HHH000412: Hibernate Core {5.2.17.Final}
2020-04-15 13:20:33.722  INFO [ftgo-order-service,,,] 1 --- [ main] org.hibernate.cfg.Environment            : HHH000206: hibernate.properties not found
2020-04-15 13:20:37.263  INFO [ftgo-order-service,,,] 1 --- [ main] o.hibernate.annotations.common.Version   : HCANN000001: Hibernate Commons Annotations {5.0.1.Final}
2020-04-15 13:21:08.220  INFO [ftgo-order-service,,,] 1 --- [ main] org.hibernate.dialect.Dialect            : HHH000400: Using dialect: org.hibernate.dialect.MySQL5Dialect
2020-04-15 13:21:15.075  WARN [ftgo-order-service,,,] 1 --- [ main] o.h.c.a.r.JPAOverriddenAnnotationReader  : HHH000207: Property net.chrisrichardson.ftgo.common.Money.amount not found in class but described in (possible typo error)
2020-04-15 13:21:32.912 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : create table hibernate_sequence (next_val bigint) engine=MyISAM
2020-04-15 13:21:33.123 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : insert into hibernate_sequence values ( 1 )
2020-04-15 13:21:33.142 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : create table order_line_items (order_id bigint not null, menu_item_id varchar(255), name varchar(255), price decimal(19,2), quantity integer not null) engine=MyISAM
2020-04-15 13:21:33.154 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : create table order_service_restaurant_menu_items (restaurant_id bigint not null, id varchar(255), name varchar(255), amount decimal(19,2)) engine=MyISAM
2020-04-15 13:21:33.177 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : create table order_service_restaurants (id bigint not null, name varchar(255), primary key (id)) engine=MyISAM
2020-04-15 13:21:33.224 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : create table orders (id bigint not null, consumer_id bigint, city varchar(255), delivery_state varchar(255), street1 varchar(255), street2 varchar(255), zip varchar(255), delivery_time datetime, amount decimal(19,2), payment_token varchar(255), restaurant_id bigint, state varchar(255), version bigint, primary key (id)) engine=MyISAM
2020-04-15 13:21:33.338 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : alter table order_line_items add constraint FKdjnh2emxm9tt6mrpfabdvbs0c foreign key (order_id) references orders (id)
2020-04-15 13:21:33.354 DEBUG [ftgo-order-service,,,] 1 --- [ main] org.hibernate.SQL                        : alter table order_service_restaurant_menu_items add constraint FKd8e0f5k91mscv19sdtmlnekuf foreign key (restaurant_id) references order_service_restaurants (id)
2020-04-15 13:21:33.408  INFO [ftgo-order-service,,,] 1 --- [ main] j.LocalContainerEntityManagerFactoryBean : Initialized JPA EntityManagerFactory for persistence unit 'default'
2020-04-15 13:21:56.443 DEBUG [ftgo-order-service,,,] 1 --- [ main] io.eventuate.common.id.IdGeneratorImpl   : Mac address 2485377957901
2020-04-15 13:22:04.994  INFO [ftgo-order-service,,,] 1 --- [ main] o.a.k.clients.consumer.ConsumerConfig    :
ConsumerConfig values:
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [kafka:9092]
	check.crcs = true
	client.id =
	connections.max.idle.ms = 540000
	enable.auto.commit = false
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = net.chrisrichardson.ftgo.orderservice.sagas.createorder.CreateOrderSaga-consumer
	heartbeat.interval.ms = 3000
	interceptor.classes = null
	internal.leave.group.on.close = true
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 305000
	retry.backoff.ms = 100
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	send.buffer.bytes = 131072
	session.timeout.ms = 30000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
	ssl.endpoint.identification.algorithm = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLS
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-15 13:22:05.676  WARN [ftgo-order-service,,,] 1 --- [ main] org.apache.kafka.clients.ClientUtils     : Removing server kafka:9092 from bootstrap.servers as DNS resolution failed for kafka
2020-04-15 13:22:05.861 ERROR [ftgo-order-service,,,] 1 --- [ main] i.e.m.k.b.c.EventuateKafkaConsumer       : Error subscribing

org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:789) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:643) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:623) ~[kafka-clients-1.0.1.jar!/:na]
	at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:71) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
	at io.eventuate.messaging.kafka.consumer.MessageConsumerKafkaImpl.subscribe(MessageConsumerKafkaImpl.java:50) [eventuate-messaging-kafka-consumer-0.3.0.RELEASE.jar!/:na]
	at io.eventuate.tram.consumer.kafka.EventuateTramKafkaMessageConsumer.subscribe(EventuateTramKafkaMessageConsumer.java:23) [eventuate-tram-consumer-kafka-0.22.0.RC5.jar!/:na]
	at io.eventuate.tram.consumer.common.MessageConsumerImpl.subscribe(MessageConsumerImpl.java:32) [eventuate-tram-consumer-common-0.22.0.RC5.jar!/:na]
	at io.eventuate.tram.sagas.orchestration.SagaManagerImpl.subscribeToReplyChannel(SagaManagerImpl.java:155) [eventuate-tram-sagas-orchestration-0.12.0.RC5.jar!/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:365) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:308) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:135) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:424) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1700) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:581) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:197) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1276) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1133) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:760) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at net.chrisrichardson.ftgo.orderservice.main.OrderServiceMain.main(OrderServiceMain.java:23) ~[classes!/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
	at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) ~[ftgo-order-service.jar:na]
	at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) ~[ftgo-order-service.jar:na]
	at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) ~[ftgo-order-service.jar:na]
	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) ~[ftgo-order-service.jar:na]
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
	at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:64) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:706) ~[kafka-clients-1.0.1.jar!/:na]
	... 68 common frames omitted
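The first failure in the log is clear from this trace: inside the order-service container the hostname `kafka` cannot be resolved, so the Kafka consumer is never constructed. A quick way to check name resolution and network attachment from the host is below (container and service names are taken from the `docker-compose up` output above; whether `getent` exists inside the service image is an assumption on my part):

```shell
# Can the order-service container resolve the Kafka broker's hostname?
docker-compose exec ftgo-order-service getent hosts kafka

# Which Docker networks did the kafka and order-service containers join?
# If they are on different networks, DNS-by-service-name will fail.
docker inspect --format '{{json .NetworkSettings.Networks}}' ftgo-application-master_kafka_1
docker inspect --format '{{json .NetworkSettings.Networks}}' ftgo-application-master_ftgo-order-service_1
```

If the first command fails while both containers are on the same Compose network, the kafka container itself may have exited early; `docker-compose ps` and `docker-compose logs kafka` would show that.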

2020-04-15 13:22:05.985  WARN [ftgo-order-service,,,] 1 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderController' defined in URL [jar:file:/ftgo-order-service.jar!/BOOT-INF/classes!/net/chrisrichardson/ftgo/orderservice/web/OrderController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
2020-04-15 13:22:06.130  INFO [ftgo-order-service,,,] 1 --- [ main] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2020-04-15 13:22:07.707  WARN [ftgo-order-service,,,] 1 --- [ main] z.r.AsyncReporter$BoundedAsyncReporter   : Timed out waiting for in-flight spans to send
2020-04-15 13:22:07.817  INFO [ftgo-order-service,,,] 1 --- [ main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2020-04-15 13:22:08.472  INFO [ftgo-order-service,,,] 1 --- [ main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.
2020-04-15 13:22:08.509  INFO [ftgo-order-service,,,] 1 --- [ main] o.apache.catalina.core.StandardService   : Stopping service [Tomcat]
2020-04-15 13:22:09.086  WARN [ftgo-order-service,,,] 1 --- [ost-startStop-2] o.a.c.loader.WebappClassLoaderBase : The web application [ROOT] appears to have started a thread named [AsyncReporter{org.springframework.cloud.sleuth.zipkin2.sender.RestTemplateSender@4f33e066}] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 sun.net.www.protocol.http.HttpURLConnection$7.run(HttpURLConnection.java:1142)
 sun.net.www.protocol.http.HttpURLConnection$7.run(HttpURLConnection.java:1140)
 java.security.AccessController.doPrivileged(Native Method)
 sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1139)
 sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050)
 sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:984)
 org.springframework.http.client.SimpleBufferingClientHttpRequest.executeInternal(SimpleBufferingClientHttpRequest.java:76)
 org.springframework.http.client.AbstractBufferingClientHttpRequest.executeInternal(AbstractBufferingClientHttpRequest.java:48)
 org.springframework.http.client.AbstractClientHttpRequest.execute(AbstractClientHttpRequest.java:53)
 org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:723)
 org.springframework.cloud.sleuth.zipkin2.sender.ZipkinRestTemplateWrapper.doExecute(ZipkinRestTemplateSenderConfiguration.java:132)
 org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:658)
 org.springframework.cloud.sleuth.zipkin2.sender.RestTemplateSender.post(RestTemplateSender.java:112)
 org.springframework.cloud.sleuth.zipkin2.sender.RestTemplateSender$HttpPostCall.doExecute(RestTemplateSender.java:123)
 org.springframework.cloud.sleuth.zipkin2.sender.RestTemplateSender$HttpPostCall.doExecute(RestTemplateSender.java:115)
 zipkin2.Call$Base.execute(Call.java:379)
 zipkin2.reporter.AsyncReporter$BoundedAsyncReporter.flush(AsyncReporter.java:286)
 zipkin2.reporter.AsyncReporter$Builder$1.run(AsyncReporter.java:190)
2020-04-15 13:22:10.350  INFO [ftgo-order-service,,,] 1 --- [ main] ConditionEvaluationReportLoggingListener :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2020-04-15 13:22:10.849 ERROR [ftgo-order-service,,,] 1 --- [ main] o.s.boot.SpringApplication               : Application run failed

org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderController' defined in URL [jar:file:/ftgo-order-service.jar!/BOOT-INF/classes!/net/chrisrichardson/ftgo/orderservice/web/OrderController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:732) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:197) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1276) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1133) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:760) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
	at net.chrisrichardson.ftgo.orderservice.main.OrderServiceMain.main(OrderServiceMain.java:23) [classes!/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
	at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) [ftgo-order-service.jar:na]
	at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) [ftgo-order-service.jar:na]
	at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) [ftgo-order-service.jar:na]
	at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) [ftgo-order-service.jar:na]
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:732) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	... 27 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:424) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1700) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:581) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	... 41 common frames omitted
Caused by: java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:115) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
	at io.eventuate.messaging.kafka.consumer.MessageConsumerKafkaImpl.subscribe(MessageConsumerKafkaImpl.java:50) ~[eventuate-messaging-kafka-consumer-0.3.0.RELEASE.jar!/:na]
	at io.eventuate.tram.consumer.kafka.EventuateTramKafkaMessageConsumer.subscribe(EventuateTramKafkaMessageConsumer.java:23) ~[eventuate-tram-consumer-kafka-0.22.0.RC5.jar!/:na]
	at io.eventuate.tram.consumer.common.MessageConsumerImpl.subscribe(MessageConsumerImpl.java:32) ~[eventuate-tram-consumer-common-0.22.0.RC5.jar!/:na]
	at io.eventuate.tram.sagas.orchestration.SagaManagerImpl.subscribeToReplyChannel(SagaManagerImpl.java:155) ~[eventuate-tram-sagas-orchestration-0.12.0.RC5.jar!/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:365) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:308) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:135) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
	... 54 common frames omitted
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:789) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:643) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:623) ~[kafka-clients-1.0.1.jar!/:na]
	at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:71) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
	... 65 common frames omitted
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
	at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:64) ~[kafka-clients-1.0.1.jar!/:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:706) ~[kafka-clients-1.0.1.jar!/:na]
	... 68 common frames omitted

2020-04-15 13:59:00.405  INFO [-,,,] 1 --- [main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@77f03bb1: startup date [Wed Apr 15 13:59:00 GMT 2020]; root of context hierarchy
2020-04-15 13:59:15.590  INFO [-,,,] 1 --- [main] trationDelegate$BeanPostProcessorChecker : Bean 'configurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$29d53d88] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)

[Spring Boot ASCII-art startup banner]  :: Spring Boot :: (v2.0.3.RELEASE)

2020-04-15 13:59:39.122  INFO [ftgo-order-service,,,] 1 --- [main] n.c.f.o.main.OrderServiceMain : No active profile set, falling back to default profiles: default
2020-04-15 13:59:39.891  INFO [ftgo-order-service,,,] 1 --- [main] ConfigServletWebServerApplicationContext : Refreshing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@5f341870: startup date [Wed Apr 15 13:59:39 GMT 2020]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@77f03bb1
2020-04-15 14:00:12.094  INFO [ftgo-order-service,,,] 1 --- [main] o.s.b.f.s.DefaultListableBeanFactory : Overriding bean definition for bean 'sagaCommandProducer' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=io.eventuate.tram.sagas.orchestration.SagaOrchestratorConfiguration; factoryMethodName=sagaCommandProducer; initMethodName=null; destroyMethodName=(inferred); defined in io.eventuate.tram.sagas.orchestration.SagaOrchestratorConfiguration] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration; factoryMethodName=sagaCommandProducer; initMethodName=null; destroyMethodName=(inferred); defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration]
2020-04-15 14:00:34.910  INFO [ftgo-order-service,,,] 1 --- [main] o.s.b.f.s.DefaultListableBeanFactory : Overriding bean definition for bean 'dataSource' with a different definition: replacing [Root bean: class [null]; scope=refresh; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=false; primary=false; factoryBeanName=org.springframework.boot.autoconfigure.jdbc.DataSourceConfiguration$Hikari; factoryMethodName=dataSource; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Hikari.class]] with [Root bean: class [org.springframework.aop.scope.ScopedProxyFactoryBean]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null; defined in BeanDefinition defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Hikari.class]]
2020-04-15 14:01:04.097  INFO [ftgo-order-service,,,] 1 --- [main] o.s.cloud.context.scope.GenericScope : BeanFactory id=152ca69f-2431-33f8-8a00-97525ce0a50b
2020-04-15 14:01:21.038  INFO [ftgo-order-service,,,] 1 --- [main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$dbb3a8b] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-04-15 14:01:23.593  INFO [ftgo-order-service,,,] 1 --- [main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type [org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$29d53d88] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-04-15 14:01:35.231  INFO [ftgo-order-service,,,] 1 --- [main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http)
2020-04-15 14:01:36.702  INFO [ftgo-order-service,,,] 1 --- [main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2020-04-15 14:01:36.710  INFO [ftgo-order-service,,,] 1 --- [main] org.apache.catalina.core.StandardEngine : Starting Servlet Engine: Apache Tomcat/8.5.31
2020-04-15 14:01:37.420  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.a.catalina.core.AprLifecycleListener : The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64/server:/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64:/usr/lib/jvm/java-1.8-openjdk/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib]
2020-04-15 14:01:40.939  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2020-04-15 14:01:40.946  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 121051 ms
2020-04-15 14:02:47.698  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'characterEncodingFilter' to: [/*]
2020-04-15 14:02:47.749  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'tracingFilter' to: [/*]
2020-04-15 14:02:47.762  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'exceptionLoggingFilter' to: [/*]
2020-04-15 14:02:47.764  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'traceIdResponseFilter' to: [/*]
2020-04-15 14:02:47.765  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'hiddenHttpMethodFilter' to: [/*]
2020-04-15 14:02:47.765  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'httpPutFormContentFilter' to: [/*]
2020-04-15 14:02:47.765  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'requestContextFilter' to: [/*]
2020-04-15 14:02:47.766  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'httpTraceFilter' to: [/*]
2020-04-15 14:02:47.768  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean : Mapping filter: 'webMvcMetricsFilter' to: [/*]
2020-04-15 14:02:47.775  INFO [ftgo-order-service,,,] 1 --- [ost-startStop-1] o.s.b.w.servlet.ServletRegistrationBean : Servlet dispatcherServlet mapped to [/]
2020-04-15 14:02:50.030 DEBUG [ftgo-order-service,,,] 1 --- [ost-startStop-1] n.c.f.o.web.TraceIdResponseFilter : Initializing filter 'traceIdResponseFilter'
2020-04-15 14:02:50.071 DEBUG [ftgo-order-service,,,] 1 --- [ost-startStop-1] n.c.f.o.web.TraceIdResponseFilter : Filter 'traceIdResponseFilter' configured successfully
2020-04-15 14:02:52.374  INFO [ftgo-order-service,,,] 1 --- [main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting...
Wed Apr 15 14:02:55 GMT 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2020-04-15 14:02:57.971  INFO [ftgo-order-service,,,] 1 --- [main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Start completed.
2020-04-15 14:02:59.329  INFO [ftgo-order-service,,,] 1 --- [main] j.LocalContainerEntityManagerFactoryBean : Building JPA container EntityManagerFactory for persistence unit 'default'
2020-04-15 14:02:59.859  INFO [ftgo-order-service,,,] 1 --- [main] o.hibernate.jpa.internal.util.LogHelper : HHH000204: Processing PersistenceUnitInfo [ name: default ...]
2020-04-15 14:03:02.768  INFO [ftgo-order-service,,,] 1 --- [main] org.hibernate.Version : HHH000412: Hibernate Core {5.2.17.Final}
2020-04-15 14:03:02.781  INFO [ftgo-order-service,,,] 1 --- [main] org.hibernate.cfg.Environment : HHH000206: hibernate.properties not found
2020-04-15 14:03:03.295  INFO [ftgo-order-service,,,] 1 --- [main] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {5.0.1.Final}
2020-04-15 14:03:09.317  INFO [ftgo-order-service,,,] 1 --- [main] org.hibernate.dialect.Dialect : HHH000400: Using dialect: org.hibernate.dialect.MySQL5Dialect
2020-04-15 14:03:11.150  WARN [ftgo-order-service,,,] 1 --- [main] o.h.c.a.r.JPAOverriddenAnnotationReader : HHH000207: Property net.chrisrichardson.ftgo.common.Money.amount not found in class but described in (possible typo error)
2020-04-15 14:03:17.133 DEBUG [ftgo-order-service,,,] 1 --- [main] org.hibernate.SQL : alter table order_line_items add constraint FKdjnh2emxm9tt6mrpfabdvbs0c foreign key (order_id) references orders (id)
2020-04-15 14:03:17.250 DEBUG [ftgo-order-service,,,] 1 --- [main] org.hibernate.SQL : alter table order_service_restaurant_menu_items add constraint FKd8e0f5k91mscv19sdtmlnekuf foreign key (restaurant_id) references order_service_restaurants (id)
2020-04-15 14:03:17.378  INFO [ftgo-order-service,,,] 1 --- [main] j.LocalContainerEntityManagerFactoryBean : Initialized JPA EntityManagerFactory for persistence unit 'default'
2020-04-15 14:03:22.409 DEBUG [ftgo-order-service,,,] 1 --- [main] io.eventuate.common.id.IdGeneratorImpl : Mac address 2485377957900
2020-04-15 14:03:24.437  INFO [ftgo-order-service,,,] 1 --- [main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [kafka:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = net.chrisrichardson.ftgo.orderservice.sagas.createorder.CreateOrderSaga-consumer
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 30000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

2020-04-15 14:03:24.546  WARN [ftgo-order-service,,,] 1 --- [main] org.apache.kafka.clients.ClientUtils : Removing server kafka:9092 from bootstrap.servers as DNS resolution failed for kafka
2020-04-15 14:03:24.576 ERROR [ftgo-order-service,,,] 1 --- [main] i.e.m.k.b.c.EventuateKafkaConsumer : Error subscribing

org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:789) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:643) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:623) ~[kafka-clients-1.0.1.jar!/:na]
    at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:71) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
    at io.eventuate.messaging.kafka.consumer.MessageConsumerKafkaImpl.subscribe(MessageConsumerKafkaImpl.java:50) [eventuate-messaging-kafka-consumer-0.3.0.RELEASE.jar!/:na]
    at io.eventuate.tram.consumer.kafka.EventuateTramKafkaMessageConsumer.subscribe(EventuateTramKafkaMessageConsumer.java:23) [eventuate-tram-consumer-kafka-0.22.0.RC5.jar!/:na]
    at io.eventuate.tram.consumer.common.MessageConsumerImpl.subscribe(MessageConsumerImpl.java:32) [eventuate-tram-consumer-common-0.22.0.RC5.jar!/:na]
    at io.eventuate.tram.sagas.orchestration.SagaManagerImpl.subscribeToReplyChannel(SagaManagerImpl.java:155) [eventuate-tram-sagas-orchestration-0.12.0.RC5.jar!/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:365) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:308) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:135) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:424) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1700) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:581) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:197) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1276) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1133) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:760) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at net.chrisrichardson.ftgo.orderservice.main.OrderServiceMain.main(OrderServiceMain.java:23) ~[classes!/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) ~[ftgo-order-service.jar:na]
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) ~[ftgo-order-service.jar:na]
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) ~[ftgo-order-service.jar:na]
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) ~[ftgo-order-service.jar:na]
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
    at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:64) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:706) ~[kafka-clients-1.0.1.jar!/:na]
    ... 68 common frames omitted
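Both stack traces bottom out in the same root cause: the order-service container cannot resolve the hostname `kafka`, so the Kafka consumer has no usable `bootstrap.servers` entry and the Spring context aborts. A quick way to narrow this down is to test name resolution where it matters, i.e. inside the compose network rather than on the Windows host. The sketch below is only illustrative; the service names (`kafka`, `ftgo-order-service`) are the ones from this repo's docker-compose.yml and the `docker-compose` commands in the comments assume that file is in the current directory.

```shell
# resolves NAME -> exit status 0 if NAME resolves via the system resolver
resolves() { getent hosts "$1" >/dev/null 2>&1; }

# On the host, `kafka` is NOT expected to resolve -- the name only exists on
# the compose network. Run the same check from inside a service container:
#   docker-compose exec ftgo-order-service getent hosts kafka
#   docker-compose ps kafka      # is the broker container actually up?
#   docker-compose logs kafka    # or did it exit during startup?
if resolves kafka; then
  echo "kafka resolves here"
else
  echo "kafka does not resolve here (expected outside the compose network)"
fi
```

If `kafka` does not resolve even inside another service container, the broker container most likely exited (often because ZOOKEEPER/ADVERTISED_HOST_NAME settings derived from DOCKER_HOST_IP are wrong), and its logs are the next place to look.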

2020-04-15 14:03:24.583  WARN [ftgo-order-service,,,] 1 --- [main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderController' defined in URL [jar:file:/ftgo-order-service.jar!/BOOT-INF/classes!/net/chrisrichardson/ftgo/orderservice/web/OrderController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
2020-04-15 14:03:24.585  INFO [ftgo-order-service,,,] 1 --- [main] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2020-04-15 14:03:25.136  INFO [ftgo-order-service,,,] 1 --- [main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown initiated...
2020-04-15 14:03:25.154  INFO [ftgo-order-service,,,] 1 --- [main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown completed.
2020-04-15 14:03:25.159  INFO [ftgo-order-service,,,] 1 --- [main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2020-04-15 14:03:25.242  INFO [ftgo-order-service,,,] 1 --- [main] ConditionEvaluationReportLoggingListener :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled. 2020-04-15 14:03:25.251 ERROR [ftgo-order-service,,,] 1 --- [ main] o.s.boot.SpringApplication : Application run failed

org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderController' defined in URL [jar:file:/ftgo-order-service.jar!/BOOT-INF/classes!/net/chrisrichardson/ftgo/orderservice/web/OrderController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:732) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:197) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1276) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1133) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:760) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:869) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) [spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at net.chrisrichardson.ftgo.orderservice.main.OrderServiceMain.main(OrderServiceMain.java:23) [classes!/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) [ftgo-order-service.jar:na]
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) [ftgo-order-service.jar:na]
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) [ftgo-order-service.jar:na]
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) [ftgo-order-service.jar:na]
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'orderService' defined in net.chrisrichardson.ftgo.orderservice.domain.OrderServiceConfiguration: Unsatisfied dependency expressed through method 'orderService' parameter 3; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:732) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    ... 27 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'createOrderSagaManager': Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:424) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1700) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:581) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    ... 41 common frames omitted
Caused by: java.lang.RuntimeException: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:115) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
    at io.eventuate.messaging.kafka.consumer.MessageConsumerKafkaImpl.subscribe(MessageConsumerKafkaImpl.java:50) ~[eventuate-messaging-kafka-consumer-0.3.0.RELEASE.jar!/:na]
    at io.eventuate.tram.consumer.kafka.EventuateTramKafkaMessageConsumer.subscribe(EventuateTramKafkaMessageConsumer.java:23) ~[eventuate-tram-consumer-kafka-0.22.0.RC5.jar!/:na]
    at io.eventuate.tram.consumer.common.MessageConsumerImpl.subscribe(MessageConsumerImpl.java:32) ~[eventuate-tram-consumer-common-0.22.0.RC5.jar!/:na]
    at io.eventuate.tram.sagas.orchestration.SagaManagerImpl.subscribeToReplyChannel(SagaManagerImpl.java:155) ~[eventuate-tram-sagas-orchestration-0.12.0.RC5.jar!/:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:365) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:308) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:135) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    ... 54 common frames omitted
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.&lt;init&gt;(KafkaConsumer.java:789) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.&lt;init&gt;(KafkaConsumer.java:643) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.&lt;init&gt;(KafkaConsumer.java:623) ~[kafka-clients-1.0.1.jar!/:na]
    at io.eventuate.messaging.kafka.basic.consumer.EventuateKafkaConsumer.start(EventuateKafkaConsumer.java:71) ~[eventuate-messaging-kafka-basic-consumer-0.3.0.RELEASE.jar!/:na]
    ... 65 common frames omitted
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
    at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:64) ~[kafka-clients-1.0.1.jar!/:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.&lt;init&gt;(KafkaConsumer.java:706) ~[kafka-clients-1.0.1.jar!/:na]
    ... 68 common frames omitted

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

cer commented 4 years ago

I would run the following commands in sequence:

  1. docker-compose up -d kafka
  2. docker-compose up -d cdc-service
  3. docker-compose up -d order-service

After running each command I would wait a while to make sure that the container starts and, in the case of the cdc-service/order-service, that docker ps shows it as healthy.

If everything seems ok I would then run docker-compose up -d to start the remaining services.

If the containers are exiting for no apparent reason, it's possible that they are running out of memory and that you need to reconfigure the Docker VM.
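The start-one-wait-then-continue sequence above can be scripted. This is a rough sketch: the container names are taken from the docker ps output later in this thread and may differ in other checkouts, and the Docker commands only run where docker-compose and a docker-compose.yml are actually present.

```shell
#!/bin/sh
# Print the health status of a container ("healthy", "starting", "unhealthy", ...).
health_of() {
  docker inspect --format '{{.State.Health.Status}}' "$1" 2>/dev/null
}

# Poll health_of until it reports "healthy"; $2 = max tries, $3 = delay in seconds.
wait_healthy() {
  tries=${2:-60}; delay=${3:-5}; i=0
  while [ "$i" -lt "$tries" ]; do
    [ "$(health_of "$1")" = "healthy" ] && return 0
    i=$((i + 1)); sleep "$delay"
  done
  echo "$1 did not become healthy" >&2
  return 1
}

if command -v docker-compose >/dev/null 2>&1 && [ -f docker-compose.yml ]; then
  docker-compose up -d kafka
  docker-compose up -d cdc-service
  wait_healthy ftgo-application-master_cdc-service_1
  docker-compose up -d ftgo-order-service
  wait_healthy ftgo-application-master_ftgo-order-service_1
  docker-compose up -d    # start the remaining services
fi
```

The polling loop is the important part: docker-compose up -d returns as soon as the container is created, not when the application inside it is ready.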

prakashid2 commented 4 years ago
  1. docker-compose up -d kafka
  2. docker-compose up -d cdc-service

The above two services started successfully. Then I tried running docker-compose up -d order-service. Should I be passing 'order-service' or some other name? Output is shown below.

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker-compose up -d order-service
ERROR: No such service: order-service

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

cer commented 4 years ago

oops. docker-compose up -d ftgo-order-service

If you look in the docker-compose.yml file you can see the container names and fix my typos :-)
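To avoid guessing, docker-compose can also list the exact service names itself with `docker-compose config --services`. The same information can be pulled out of the YAML directly; a sketch against a hypothetical minimal compose file (the file below is an illustration, not the project's real docker-compose.yml):

```shell
#!/bin/sh
# Hypothetical minimal docker-compose.yml, for illustration only.
cat > /tmp/compose-sample.yml <<'EOF'
services:
  ftgo-order-service:
    image: ftgo/ftgo-order-service
  cdc-service:
    image: eventuateio/eventuate-cdc-service
EOF

# Service names are the keys indented one level under "services:".
awk '/^[^ ]/ { in_services = ($0 == "services:") }
     in_services && /^  [A-Za-z]/ { sub(/:.*/, ""); sub(/^  /, ""); print }' \
    /tmp/compose-sample.yml
# prints:
#   ftgo-order-service
#   cdc-service
```

Whatever this prints is exactly what `docker-compose up -d <name>` expects.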

prakashid2 commented 4 years ago

I ran following commands. Output does not show any error. But http://192.168.99.100:8889 cannot be reached.

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker-compose up -d ftgo-order-service
ftgo-application-master_zookeeper_1 is up-to-date
ftgo-application-master_zipkin_1 is up-to-date
ftgo-application-master_mysql_1 is up-to-date
ftgo-application-master_kafka_1 is up-to-date
ftgo-application-master_cdc-service_1 is up-to-date
Starting ftgo-application-master_ftgo-order-service_1 ... done

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker ps
CONTAINER ID  IMAGE                                            COMMAND                CREATED      STATUS                        PORTS                                       NAMES
39b4be440915  ftgo-application-master_ftgo-order-service       "/bin/sh -c 'java ${"  4 hours ago  Up About a minute (healthy)   0.0.0.0:8082->8080/tcp                      ftgo-application-master_ftgo-order-service_1
32e478759f46  eventuateio/eventuate-cdc-service:0.4.0.RELEASE  "/bin/sh -c 'java ${"  4 hours ago  Up 21 minutes (healthy)       0.0.0.0:8099->8080/tcp                      ftgo-application-master_cdc-service_1
47110c01a6a7  eventuateio/eventuate-kafka:0.3.0.RELEASE        "/bin/bash -c ./run-"  4 hours ago  Up 25 minutes                 0.0.0.0:9092->9092/tcp                      ftgo-application-master_kafka_1
7b6d9f6411f0  ftgo-application-master_dynamodblocal-init       "/bin/sh -c './wait-"  4 hours ago  Up 4 hours (healthy)                                                      ftgo-application-master_dynamodblocal-init_1
37f828ac67e0  openzipkin/zipkin:2.5.0                          "/bin/sh -c 'test -n"  4 hours ago  Up 4 hours                    9410/tcp, 0.0.0.0:9411->9411/tcp            ftgo-application-master_zipkin_1
fcd514af0edd  eventuateio/eventuate-zookeeper:0.4.0.RELEASE    "/usr/local/zookeepe"  4 hours ago  Up 4 hours                    2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp  ftgo-application-master_zookeeper_1
4f61d23cafb7  ftgo-application-master_mysql                    "docker-entrypoint.s"  4 hours ago  Up 4 hours                    0.0.0.0:3306->3306/tcp                      ftgo-application-master_mysql_1
cdb12f2404fd  ftgo-application-master_dynamodblocal            "/bin/sh -c 'java -j"  4 hours ago  Up 4 hours (healthy)          0.0.0.0:8000->8000/tcp                      ftgo-application-master_dynamodblocal_1
dc91e5a7b938  ftgo-application-master_ftgo-api-gateway         "/bin/sh -c 'java ${"  4 hours ago  Up 4 hours (healthy)          0.0.0.0:8087->8080/tcp                      ftgo-application-master_ftgo-api-gateway_1

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker-compose up -d
ftgo-application-master_mysql_1 is up-to-date
ftgo-application-master_zookeeper_1 is up-to-date
ftgo-application-master_dynamodblocal_1 is up-to-date
ftgo-application-master_zipkin_1 is up-to-date
ftgo-application-master_ftgo-api-gateway_1 is up-to-date
ftgo-application-master_kafka_1 is up-to-date
ftgo-application-master_dynamodblocal-init_1 is up-to-date
ftgo-application-master_cdc-service_1 is up-to-date
Starting ftgo-application-master_ftgo-accounting-service_1 ... done
Starting ftgo-application-master_ftgo-restaurant-service_1 ... done
Starting ftgo-application-master_ftgo-kitchen-service_1 ... done
Starting ftgo-application-master_ftgo-order-history-service_1 ... done
Starting ftgo-application-master_ftgo-consumer-service_1 ... done

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

cer commented 4 years ago

Are you saying that you ran

docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP \
     --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE

And it did not work? What was the error?

cer commented 4 years ago

More importantly:

  1. What's the output of docker ps -a?
  2. Can you access the swagger URLs?
prakashid2 commented 4 years ago

The processes are very short-lived, around 20 minutes, and they exit after some time (reason unknown). Now docker ps -a is showing that:

  1. ftgo-application-master_ftgo-order-history-service_1 exited 20 mins ago, but I know for sure I tried accessing the swagger UI for it well before it exited
  2. same with ftgo-application-master_ftgo-consumer-service_1
  3. ftgo-application-master_ftgo-order-service_1
  4. ftgo-application-master_ftgo-accounting-service_1
  5. eventuateio/eventuate-cdc-service:0.4.0.RELEASE unhealthy

I had tried to access the swagger UIs immediately after docker-compose came up, but no luck.

I also verified the DOCKER_HOST_IP with the following command. It shows success, but the connection is actually refused when accessing 192.168.99.100:8889 from a browser.

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE
DOCKER_HOST_IP= 192.168.99.100
Server running on port: 8889
About to make HTTP request to self
Making HTTP request to self via url= http://192.168.99.100:8889
SUCCESSS!!!!

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

cer commented 4 years ago

It's likely you have to increase the memory - see https://docs.docker.com/docker-for-windows/#docker-settings-dialog

prakashid2 commented 4 years ago

Ok, I will try to do this by tomorrow. But one question: doesn't it show or log an OUT_OF_MEMORY error on the console, or wherever the logs are stored? I think I should check the log first and then try to increase the memory. Do you know where I can find the relevant log?

cer commented 4 years ago

Normally, errors show up in the container logs - the output of docker logs &lt;container&gt;. However, if containers are exiting without any error messages, I'd suspect it's a lack-of-memory issue that causes termination of the container. You might see messages in the Docker daemon logs: https://docs.docker.com/config/daemon/
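A small helper along those lines; the container name is the one from the docker ps output earlier in this thread, and docker inspect's State.OOMKilled field records whether the kernel killed the container for memory (exit code 137 = SIGKILL is the usual companion symptom).

```shell
#!/bin/sh
# Show the last log lines of a container plus whether it was OOM-killed.
show_exit_reason() {
  docker logs "$1" 2>&1 | tail -n 50
  docker inspect --format 'OOMKilled={{.State.OOMKilled}} ExitCode={{.State.ExitCode}}' "$1"
}

# Only attempt this where the docker CLI exists (read-only commands).
if command -v docker >/dev/null 2>&1; then
  docker ps -a --filter status=exited      # which containers died, and how
  show_exit_reason ftgo-application-master_ftgo-order-service_1
fi
```

If OOMKilled is false but the container still exited, the application log itself (the docker logs output) is the place to look, as the later comments in this thread confirm.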

prakashid2 commented 4 years ago

Hey Chris.

I have a Windows 8.1 laptop, and Docker Desktop can only be installed on Windows 10 or higher - correct me if I'm wrong. I've installed Docker Toolbox for Windows 8.1, that is DockerToolbox-19.03.1.exe, and I don't see a Docker Desktop icon in the notifications area. How can I set or increase the memory?

cer commented 4 years ago

I see. You need to recreate the Docker Toolbox VM with more memory - google 'docker toolbox memory settings'.
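With Docker Toolbox, that means recreating the VM via docker-machine. A sketch, assuming the default VirtualBox driver and the machine name `default`; note that removing the VM destroys its containers and images:

```shell
#!/bin/sh
MEM_GB=6
MEM_MB=$((MEM_GB * 1024))   # docker-machine takes the size in megabytes: 6 GB = 6144 MB

if command -v docker-machine >/dev/null 2>&1; then
  docker-machine rm -f default              # destroys the existing VM!
  docker-machine create --driver virtualbox \
    --virtualbox-memory "$MEM_MB" \
    --virtualbox-cpus 2 \
    default
  eval "$(docker-machine env default)"      # point this shell at the new VM
fi
echo "requested ${MEM_MB} MB"
```

The 2-CPU / 6144 MB values match what the reporter ends up using later in this thread.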

prakashid2 commented 4 years ago

Chris.

The output of the following command differs between the normal Windows command prompt and the Docker Quickstart Terminal (which opens a MINGW64 CLI, a Linux-like environment):

docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE

  1. In the normal Windows command prompt I get the success message, but Chrome (or any other browser) says 'This site can’t be reached. 192.168.99.100 refused to connect.' Please see the output below.

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>set DOCKER_HOST_IP=192.168.99.100

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>echo %DOCKER_HOST_IP%
192.168.99.100

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE
DOCKER_HOST_IP= 192.168.99.100
Server running on port: 8889
About to make HTTP request to self
Making HTTP request to self via url= http://192.168.99.100:8889
SUCCESSS!!!!

D:\AllWorkspaces\manningWorkspace\ftgo-application-master>

  1. But when using the MINGW64 terminal I'm not getting the success message, and http://192.168.99.100:8889 is not accessible. Please see the output below.

                    ##         .
              ## ## ##        ==
           ## ## ## ## ##    ===
       /"""""""""""""""""\___/ ===
    
           \______ o           __/
             \    \         __/
              \____\_______/

docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com

Start interactive shell

Lenovo@PRAKASH-LENOVO MINGW64 /c/Program Files/Docker Toolbox
$ pwd
/c/Program Files/Docker Toolbox

Lenovo@PRAKASH-LENOVO MINGW64 /c/Program Files/Docker Toolbox $ cd /d/AllWorkspaces/manningWorkspace/ftgo-application-master/

Lenovo@PRAKASH-LENOVO MINGW64 /d/AllWorkspaces/manningWorkspace/ftgo-application-master $ echo $DOCKER_HOST_IP

Lenovo@PRAKASH-LENOVO MINGW64 /d/AllWorkspaces/manningWorkspace/ftgo-application-master $ DOCKER_HOST_IP=192.168.99.100

Lenovo@PRAKASH-LENOVO MINGW64 /d/AllWorkspaces/manningWorkspace/ftgo-application-master
$ echo $DOCKER_HOST_IP
192.168.99.100

Lenovo@PRAKASH-LENOVO MINGW64 /d/AllWorkspaces/manningWorkspace/ftgo-application-master
$ docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE
DOCKER_HOST_IP is not set or is blank

Lenovo@PRAKASH-LENOVO MINGW64 /d/AllWorkspaces/manningWorkspace/ftgo-application-master $

Is there a difference between working in the normal Windows command prompt and in the MINGW64 terminal? In both cases I've set the environment variable, and it's reflected by the echo command in both cases. But docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE seems not to work on the MINGW64 terminal (message: DOCKER_HOST_IP is not set or is blank), while the same command works on the normal Windows command prompt (shows the success message). Why so?

Is my DOCKER_HOST_IP environment variable not set properly in the MINGW64 terminal? Could this be the cause of the Swagger UIs also not being accessible? Do you know how to overcome this issue?

Pls see attachments

Thanks,
Prakash S., Mumbai, India

DOCKER_HOST_IP_in_MINGW64_cli
DOCKER_HOST_IP_in_normal_Windows_command_prompt
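The MINGW64 symptom above is consistent with a plain shell assignment that was never exported: `docker run -e DOCKER_HOST_IP` forwards the variable from the process *environment*, and in bash an assignment without `export` stays local to the shell, even though `echo $VAR` still shows it. A minimal demonstration with a placeholder variable (in the Quickstart Terminal the likely fix would be `export DOCKER_HOST_IP=192.168.99.100`):

```shell
#!/bin/sh
DEMO_VAR=192.168.99.100      # plain assignment: shell-local, NOT in the environment
sh -c 'echo "child sees: ${DEMO_VAR:-<blank>}"'   # prints: child sees: <blank>

export DEMO_VAR              # export copies the variable into the environment
sh -c 'echo "child sees: ${DEMO_VAR:-<blank>}"'   # prints: child sees: 192.168.99.100
```

This also explains the cmd.exe difference: `set VAR=...` in cmd always puts the variable into the process environment, so the same `docker run -e DOCKER_HOST_IP` picks it up there without any extra step.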

prakashid2 commented 4 years ago

TryingToAccessSwaggerUI

Swagger UIs not accessible

prakashid2 commented 4 years ago

docker logs container

Hi Chris.

You pointed out that I may have to start docker-machine with more memory. I did that, and it now uses 2 CPUs and 6144 MB (6 GB) of memory; earlier it was one CPU and 2 GB. I correctly followed your advice to bring up each service one by one, e.g.:

docker-compose up -d kafka
docker-compose up -d cdc-service
docker-compose up -d ftgo-order-service

and finally ran docker-compose up -d

to bring up all the services, and I was monitoring each service's status (up and running vs. exited) using docker ps -a. I noticed that 3-4 services exited; ftgo-order-service was one of them. Hence I ran docker logs &lt;container&gt; for ftgo-order-service. The service is exiting not because of out-of-memory but because of some other exception, as you can see in the stack trace given below.

-- Exception stack trace for ftgo-order-service --
2020-04-19 14:42:53.036 ERROR [ftgo-order-service,,,] 1 --- [main] com.zaxxer.hikari.pool.HikariPool : HikariPool-1 - Exception during pool initialization.

java.sql.SQLException: Access denied for user 'ftgo_order_service_user'@'172.19.0.7' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:965) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3976) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3912) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:871) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1714) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1224) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2190) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2221) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2016) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.ConnectionImpl.&lt;init&gt;(ConnectionImpl.java:776) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.JDBC4Connection.&lt;init&gt;(JDBC4Connection.java:47) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_171]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_171]
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:386) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:330) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]
    at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:117) ~[HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:123) ~[HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:365) ~[HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:194) ~[HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:460) [HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:534) [HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.pool.HikariPool.&lt;init&gt;(HikariPool.java:115) [HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) [HikariCP-2.7.9.jar!/:na]
    at com.zaxxer.hikari.HikariDataSource$$FastClassBySpringCGLIB$$eeb1ae86.invoke(&lt;generated&gt;) [HikariCP-2.7.9.jar!/:na]
    at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) [spring-core-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:136) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:124) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) [spring-aop-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at com.zaxxer.hikari.HikariDataSource$$EnhancerBySpringCGLIB$$38ad09f0.getConnection(&lt;generated&gt;) [HikariCP-2.7.9.jar!/:na]
    at org.springframework.jdbc.datasource.DataSourceUtils.fetchConnection(DataSourceUtils.java:151) [spring-jdbc-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) [spring-jdbc-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:78) [spring-jdbc-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:319) [spring-jdbc-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:356) [spring-jdbc-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.DatabaseLookup.getDatabase(DatabaseLookup.java:72) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.JpaProperties.determineDatabase(JpaProperties.java:166) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration.jpaVendorAdapter(JpaBaseConfiguration.java:111) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaConfiguration$$EnhancerBySpringCGLIB$$6b54669.CGLIB$jpaVendorAdapter$4(&lt;generated&gt;) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaConfiguration$$EnhancerBySpringCGLIB$$6b54669$$FastClassBySpringCGLIB$$86eb77f.invoke(&lt;generated&gt;) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228) [spring-core-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:361) [spring-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaConfiguration$$EnhancerBySpringCGLIB$$6b54669.jpaVendorAdapter(&lt;generated&gt;) [spring-boot-autoconfigure-2.0.3.RELEASE.jar!/:2.0.3.RELEASE]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171]
    at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:582) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1256) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1105) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE.
jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5 .0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0 .7.RELEASE] at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7 .RELEASE] at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:251) ~[spring-beans-5.0.7.RELEASE. jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1138) ~[spring-bea ns-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1065) ~[spring-beans -5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:818) [spring-beans-5.0.7.RE LEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:724) [spring-beans-5.0.7.RELEASE .jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:474) [spring-beans-5.0 .7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFacto ry.java:1256) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:110 5) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:543) [spr ing-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:503) [sprin g-beans-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:317) [spring-beans-5.0.7.RELEASE. jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) ~[spring-beans-5 .0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:315) [spring-beans-5.0.7.RELEASE.jar!/:5.0 .7.RELEASE] at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) [spring-beans-5.0.7.RELEASE.jar!/:5.0.7 .RELEASE] at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1089) ~[spring-context-5.0.7.RELEASE .jar!/:5.0.7.RELEASE] at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:859) ~[sprin g-context-5.0.7.RELEASE.jar!/:5.0.7.RELEASE] at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:550) ~[spring-context-5.0.7.RELEASE. 
jar!/:5.0.7.RELEASE] at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140) ~[spri ng-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:759) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:395) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243) ~[spring-boot-2.0.3.RELEASE.jar!/:2.0.3.RELEASE] at net.chrisrichardson.ftgo.orderservice.main.OrderServiceMain.main(OrderServiceMain.java:23) ~[classes!/:na] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_171] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_171] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_171] at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_171] at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48) ~[ftgo-order-service.jar:na] at org.springframework.boot.loader.Launcher.launch(Launcher.java:87) ~[ftgo-order-service.jar:na] at org.springframework.boot.loader.Launcher.launch(Launcher.java:50) ~[ftgo-order-service.jar:na] at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51) ~[ftgo-order-service.jar:na]

2020-04-19 14:42:53.057 WARN [ftgo-order-service,,,] 1 --- [ main] o.s.b.a.orm.jpa.DatabaseLookup : Unable to determine jdbc url from datasource


Please suggest how I can overcome this issue.

Thanks, Prakash S., Mumbai, India

prakashid2 commented 4 years ago

Mistakenly clicked the Close button. Reopened.

cer commented 4 years ago

java.sql.SQLException: Access denied for user 'ftgo_order_service_user'@'172.19.0.7' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:965) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]

This is a database problem. What's in the mysql container's log?
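When the mysql log is as long as the one below, it helps to filter it for the authentication failure first. A minimal sketch; the here-document stands in for real log output (the error line is copied from the order-service trace in this thread), and in practice you would pipe `docker logs <mysql-container> 2>&1` into the grep instead:

```shell
# Scan log text for the access-denied error. The here-document is a
# stand-in sample; replace it by piping in the real container log.
grep -i 'access denied' <<'EOF'
2020-04-15T13:10:18.840328Z 0 [Note] Event Scheduler: Loaded 0 events
Access denied for user 'ftgo_order_service_user'@'172.19.0.7' (using password: YES)
2020-04-15T13:10:18.844081Z 0 [Note] mysqld: ready for connections.
EOF
```

If the grep comes back empty, the failure happened before the client reached mysqld (wrong host/port) rather than at authentication.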

prakashid2 commented 4 years ago

-- mysql container's log below --

D:\intelliJWorkspaces\manningWorkspace\ftgo-application>docker logs 4f61d23cafb7 Initializing database 2020-04-15T13:10:06.741249Z 0 [Warning] InnoDB: New log files created, LSN=45790 2020-04-15T13:10:07.107836Z 0 [Warning] InnoDB: Creating foreign key constraint system tables. 2020-04-15T13:10:07.211231Z 0 [Warning] No existing UUID has been found, so we assume that this is the first time that this server has been started. Generating a new UUID: 6d8ddc3c-7f1a-11ea-93f4-0242ac120004. 2020-04-15T13:10:07.212138Z 0 [Warning] Gtid table is not ready to be used. Table 'mysql.gtid_executed' cannot be opened. 2020-04-15T13:10:07.213112Z 1 [Warning] root@localhost is created with an empty password ! Please consider switching off the --initialize-insecure option. 2020-04-15T13:10:10.398184Z 1 [Warning] 'user' entry 'root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:10.402872Z 1 [Warning] 'user' entry 'mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:10.403030Z 1 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:10.403087Z 1 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:10.403166Z 1 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode. Database initialized MySQL init process in progress... MySQL init process in progress... 2020-04-15T13:10:16.102565Z 0 [Note] mysqld (mysqld 5.7.13-log) starting as process 49 ...
2020-04-15T13:10:16.227844Z 0 [Note] InnoDB: PUNCH HOLE support available 2020-04-15T13:10:16.228046Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins 2020-04-15T13:10:16.228092Z 0 [Note] InnoDB: Uses event mutexes 2020-04-15T13:10:16.228313Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier 2020-04-15T13:10:16.228827Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.8 2020-04-15T13:10:16.232164Z 0 [Note] InnoDB: Using Linux native AIO 2020-04-15T13:10:16.237667Z 0 [Note] InnoDB: Number of pools: 1 2020-04-15T13:10:16.244032Z 0 [Note] InnoDB: Using CPU crc32 instructions 2020-04-15T13:10:16.265154Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M 2020-04-15T13:10:16.384853Z 0 [Note] InnoDB: Completed initialization of buffer pool 2020-04-15T13:10:16.466136Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority(). 2020-04-15T13:10:16.502202Z 0 [Note] InnoDB: Highest supported file format is Barracuda. 2020-04-15T13:10:16.678977Z 0 [Note] InnoDB: Creating shared tablespace for temporary tables 2020-04-15T13:10:16.681815Z 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ... 2020-04-15T13:10:16.921331Z 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB. 2020-04-15T13:10:16.928939Z 0 [Note] InnoDB: 96 redo rollback segment(s) found. 96 redo rollback segment(s) are active. 2020-04-15T13:10:16.948207Z 0 [Note] InnoDB: 32 non-redo rollback segment(s) are active. 2020-04-15T13:10:16.952514Z 0 [Note] InnoDB: Waiting for purge to start 2020-04-15T13:10:17.172783Z 0 [Note] InnoDB: 5.7.13 started; log sequence number 2525487 2020-04-15T13:10:17.200312Z 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool 2020-04-15T13:10:17.203289Z 0 [Note] Plugin 'FEDERATED' is disabled. 
2020-04-15T13:10:17.462227Z 0 [Note] InnoDB: Buffer pool(s) load completed at 200415 13:10:17 2020-04-15T13:10:17.586245Z 0 [Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key MySQL init process in progress... 2020-04-15T13:10:17.695888Z 0 [Warning] 'user' entry 'root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:17.696569Z 0 [Warning] 'user' entry 'mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:17.700589Z 0 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:17.702099Z 0 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:18.363557Z 0 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:18.840328Z 0 [Note] Event Scheduler: Loaded 0 events 2020-04-15T13:10:18.844081Z 0 [Note] mysqld: ready for connections. Version: '5.7.13-log' socket: '/var/run/mysqld/mysqld.sock' port: 0 MySQL Community Server (GPL) Warning: Unable to load '/usr/share/zoneinfo/Factory' as time zone. Skipping it. Warning: Unable to load '/usr/share/zoneinfo/iso3166.tab' as time zone. Skipping it. Warning: Unable to load '/usr/share/zoneinfo/leap-seconds.list' as time zone. Skipping it. Warning: Unable to load '/usr/share/zoneinfo/posix/Factory' as time zone. Skipping it. Warning: Unable to load '/usr/share/zoneinfo/right/Factory' as time zone. Skipping it. Warning: Unable to load '/usr/share/zoneinfo/zone.tab' as time zone. Skipping it. 2020-04-15T13:10:50.144225Z 4 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:50.151738Z 4 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:50.162170Z 4 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode. 
mysql: [Warning] Using a password on the command line interface can be insecure. mysql: [Warning] Using a password on the command line interface can be insecure. 2020-04-15T13:10:50.315767Z 6 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:50.316742Z 6 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:10:50.317343Z 6 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode.

/usr/local/bin/docker-entrypoint.sh: running /docker-entrypoint-initdb.d/3.common-schema.sql mysql: [Warning] Using a password on the command line interface can be insecure.

/usr/local/bin/docker-entrypoint.sh: running /docker-entrypoint-initdb.d/4.compile-schema-per-service.sh

/usr/local/bin/docker-entrypoint.sh: running /docker-entrypoint-initdb.d/5.schema-per-service.sql mysql: [Warning] Using a password on the command line interface can be insecure.

/usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/template

2020-04-15T13:10:54.082804Z 0 [Note] Giving 0 client threads a chance to die gracefully 2020-04-15T13:10:54.084183Z 0 [Note] Shutting down slave threads 2020-04-15T13:10:54.085677Z 0 [Note] Forcefully disconnecting 0 remaining clients 2020-04-15T13:10:54.085853Z 0 [Note] Event Scheduler: Purging the queue. 0 events 2020-04-15T13:10:54.093105Z 0 [Note] Binlog end 2020-04-15T13:10:54.104765Z 0 [Note] Shutting down plugin 'ngram' 2020-04-15T13:10:54.105106Z 0 [Note] Shutting down plugin 'BLACKHOLE' 2020-04-15T13:10:54.105204Z 0 [Note] Shutting down plugin 'partition' 2020-04-15T13:10:54.105246Z 0 [Note] Shutting down plugin 'ARCHIVE' 2020-04-15T13:10:54.105284Z 0 [Note] Shutting down plugin 'MEMORY' 2020-04-15T13:10:54.106003Z 0 [Note] Shutting down plugin 'INNODB_SYS_VIRTUAL' 2020-04-15T13:10:54.106217Z 0 [Note] Shutting down plugin 'INNODB_SYS_DATAFILES' 2020-04-15T13:10:54.106332Z 0 [Note] Shutting down plugin 'INNODB_SYS_TABLESPACES' 2020-04-15T13:10:54.106438Z 0 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN_COLS' 2020-04-15T13:10:54.106495Z 0 [Note] Shutting down plugin 'INNODB_SYS_FOREIGN' 2020-04-15T13:10:54.106973Z 0 [Note] Shutting down plugin 'INNODB_SYS_FIELDS' 2020-04-15T13:10:54.107043Z 0 [Note] Shutting down plugin 'INNODB_SYS_COLUMNS' 2020-04-15T13:10:54.107160Z 0 [Note] Shutting down plugin 'INNODB_SYS_INDEXES' 2020-04-15T13:10:54.107872Z 0 [Note] Shutting down plugin 'INNODB_SYS_TABLESTATS' 2020-04-15T13:10:54.108187Z 0 [Note] Shutting down plugin 'INNODB_SYS_TABLES' 2020-04-15T13:10:54.108322Z 0 [Note] Shutting down plugin 'INNODB_FT_INDEX_TABLE' 2020-04-15T13:10:54.108367Z 0 [Note] Shutting down plugin 'INNODB_FT_INDEX_CACHE' 2020-04-15T13:10:54.108404Z 0 [Note] Shutting down plugin 'INNODB_FT_CONFIG' 2020-04-15T13:10:54.108441Z 0 [Note] Shutting down plugin 'INNODB_FT_BEING_DELETED' 2020-04-15T13:10:54.108477Z 0 [Note] Shutting down plugin 'INNODB_FT_DELETED' 2020-04-15T13:10:54.108858Z 0 [Note] Shutting down plugin 'INNODB_FT_DEFAULT_STOPWORD' 
2020-04-15T13:10:54.108958Z 0 [Note] Shutting down plugin 'INNODB_METRICS' 2020-04-15T13:10:54.109010Z 0 [Note] Shutting down plugin 'INNODB_TEMP_TABLE_INFO' 2020-04-15T13:10:54.109698Z 0 [Note] Shutting down plugin 'INNODB_BUFFER_POOL_STATS' 2020-04-15T13:10:54.110090Z 0 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE_LRU' 2020-04-15T13:10:54.110746Z 0 [Note] Shutting down plugin 'INNODB_BUFFER_PAGE' 2020-04-15T13:10:54.111076Z 0 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX_RESET' 2020-04-15T13:10:54.111143Z 0 [Note] Shutting down plugin 'INNODB_CMP_PER_INDEX' 2020-04-15T13:10:54.111194Z 0 [Note] Shutting down plugin 'INNODB_CMPMEM_RESET' 2020-04-15T13:10:54.111232Z 0 [Note] Shutting down plugin 'INNODB_CMPMEM' 2020-04-15T13:10:54.111269Z 0 [Note] Shutting down plugin 'INNODB_CMP_RESET' 2020-04-15T13:10:54.111847Z 0 [Note] Shutting down plugin 'INNODB_CMP' 2020-04-15T13:10:54.111912Z 0 [Note] Shutting down plugin 'INNODB_LOCK_WAITS' 2020-04-15T13:10:54.111952Z 0 [Note] Shutting down plugin 'INNODB_LOCKS' 2020-04-15T13:10:54.112049Z 0 [Note] Shutting down plugin 'INNODB_TRX' 2020-04-15T13:10:54.112104Z 0 [Note] Shutting down plugin 'InnoDB' 2020-04-15T13:10:54.119538Z 0 [Note] InnoDB: FTS optimize thread exiting. 2020-04-15T13:10:54.125859Z 0 [Note] InnoDB: Starting shutdown... 
2020-04-15T13:10:54.227957Z 0 [Note] InnoDB: Dumping buffer pool(s) to /var/lib/mysql/ib_buffer_pool 2020-04-15T13:10:54.228289Z 0 [Note] InnoDB: Buffer pool(s) dump completed at 200415 13:10:54 2020-04-15T13:10:57.640354Z 0 [Note] InnoDB: Shutdown completed; log sequence number 12577494 2020-04-15T13:10:57.646100Z 0 [Note] InnoDB: Removed temporary tablespace data file: "ibtmp1" 2020-04-15T13:10:57.646328Z 0 [Note] Shutting down plugin 'MyISAM' 2020-04-15T13:10:57.647010Z 0 [Note] Shutting down plugin 'MRG_MYISAM' 2020-04-15T13:10:57.650283Z 0 [Note] Shutting down plugin 'CSV' 2020-04-15T13:10:57.653426Z 0 [Note] Shutting down plugin 'PERFORMANCE_SCHEMA' 2020-04-15T13:10:57.655059Z 0 [Note] Shutting down plugin 'sha256_password' 2020-04-15T13:10:57.655439Z 0 [Note] Shutting down plugin 'mysql_native_password' 2020-04-15T13:10:57.666297Z 0 [Note] Shutting down plugin 'binlog' 2020-04-15T13:10:57.697752Z 0 [Note] mysqld: Shutdown complete

MySQL init process done. Ready for start up.

2020-04-15T13:11:00.225153Z 0 [Note] mysqld (mysqld 5.7.13-log) starting as process 1 ... 2020-04-15T13:11:00.733943Z 0 [Note] InnoDB: PUNCH HOLE support available 2020-04-15T13:11:00.736699Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins 2020-04-15T13:11:00.737062Z 0 [Note] InnoDB: Uses event mutexes 2020-04-15T13:11:00.737184Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier 2020-04-15T13:11:00.737474Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.8 2020-04-15T13:11:00.737603Z 0 [Note] InnoDB: Using Linux native AIO 2020-04-15T13:11:00.749952Z 0 [Note] InnoDB: Number of pools: 1 2020-04-15T13:11:00.765989Z 0 [Note] InnoDB: Using CPU crc32 instructions 2020-04-15T13:11:00.852956Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M 2020-04-15T13:11:01.057858Z 0 [Note] InnoDB: Completed initialization of buffer pool 2020-04-15T13:11:01.106705Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority(). 2020-04-15T13:11:01.180264Z 0 [Note] InnoDB: Highest supported file format is Barracuda. 2020-04-15T13:11:01.498252Z 0 [Note] InnoDB: Creating shared tablespace for temporary tables 2020-04-15T13:11:01.505374Z 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ... 2020-04-15T13:11:01.724865Z 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB. 2020-04-15T13:11:01.726379Z 0 [Note] InnoDB: 96 redo rollback segment(s) found. 96 redo rollback segment(s) are active. 2020-04-15T13:11:01.738880Z 0 [Note] InnoDB: 32 non-redo rollback segment(s) are active. 
2020-04-15T13:11:01.739464Z 0 [Note] InnoDB: Waiting for purge to start 2020-04-15T13:11:01.828422Z 0 [Note] InnoDB: 5.7.13 started; log sequence number 12577494 2020-04-15T13:11:01.840995Z 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool 2020-04-15T13:11:01.841477Z 0 [Note] Plugin 'FEDERATED' is disabled. 2020-04-15T13:11:02.394097Z 0 [Note] InnoDB: Buffer pool(s) load completed at 200415 13:11:02 2020-04-15T13:11:02.596792Z 0 [Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key 2020-04-15T13:11:02.597011Z 0 [Note] Server hostname (bind-address): '*'; port: 3306 2020-04-15T13:11:02.602675Z 0 [Note] IPv6 is available. 2020-04-15T13:11:02.607794Z 0 [Note] - '::' resolves to '::'; 2020-04-15T13:11:02.608115Z 0 [Note] Server socket created on IP: '::'. 2020-04-15T13:11:02.663550Z 0 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:11:02.672863Z 0 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:11:02.802841Z 0 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-15T13:11:03.356342Z 0 [Note] Event Scheduler: Loaded 0 events 2020-04-15T13:11:03.430803Z 0 [Note] mysqld: ready for connections. 
Version: '5.7.13-log' socket: '/var/run/mysqld/mysqld.sock' port: 3306 MySQL Community Server (GPL) 2020-04-15T14:00:50.453847Z 55 [Note] Aborted connection 55 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:50.644494Z 57 [Note] Aborted connection 57 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:50.676003Z 52 [Note] Aborted connection 52 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:50.676221Z 58 [Note] Aborted connection 58 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:50.933107Z 53 [Note] Aborted connection 53 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:50.948752Z 60 [Note] Aborted connection 60 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:51.026463Z 54 [Note] Aborted connection 54 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:51.073940Z 61 [Note] Aborted connection 61 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:51.143612Z 56 [Note] Aborted connection 56 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: '172.18.0.14' (Got an error reading communication packets) 2020-04-15T14:00:51.255143Z 59 [Note] Aborted connection 59 to db: 'ftgo_restaurant_service' user: 'ftgo_restaurant_service_user' host: 
'172.18.0.14' (Got an error reading communication packets) 2020-04-15T15:29:15.940918Z 0 [Note] InnoDB: page_cleaner: 1000ms intended loop took 2380488ms. The settings might not be optimal. (flushed=0 and evicted=0, during the time.) 2020-04-15T16:26:49.996167Z 0 [Note] InnoDB: page_cleaner: 1000ms intended loop took 2784546ms. The settings might not be optimal. (flushed=0 and evicted=0, during the time.) 2020-04-15T17:14:31.780885Z 112 [Note] Start binlog_dump to master_thread_id(112) slave_server(1), pos(, 4) 2020-04-15T17:38:49.911542Z 0 [Note] InnoDB: page_cleaner: 1000ms intended loop took 8304ms. The settings might not be optimal. (flushed=3 and evicted=0, during the time.) 2020-04-15T17:40:43.732187Z 122 [Note] Aborted connection 122 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.732202Z 117 [Note] Aborted connection 117 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.732194Z 119 [Note] Aborted connection 119 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.732100Z 114 [Note] Aborted connection 114 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.732139Z 116 [Note] Aborted connection 116 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.740042Z 115 [Note] Aborted connection 115 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.736358Z 121 [Note] Aborted connection 121 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets)
2020-04-15T17:40:43.724979Z 120 [Note] Aborted connection 120 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:43.999578Z 118 [Note] Aborted connection 118 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-15T17:40:44.041793Z 113 [Note] Aborted connection 113 to db: 'ftgo_order_service' user: 'ftgo_order_service_user' host: '172.18.0.10' (Got an error reading communication packets) 2020-04-18T07:13:52.096003Z 0 [Note] mysqld (mysqld 5.7.13-log) starting as process 1 ... 2020-04-18T07:13:52.224916Z 0 [Note] InnoDB: PUNCH HOLE support available 2020-04-18T07:13:52.225073Z 0 [Note] InnoDB: Mutexes and rw_locks use GCC atomic builtins 2020-04-18T07:13:52.225117Z 0 [Note] InnoDB: Uses event mutexes 2020-04-18T07:13:52.225150Z 0 [Note] InnoDB: GCC builtin __atomic_thread_fence() is used for memory barrier 2020-04-18T07:13:52.225182Z 0 [Note] InnoDB: Compressed tables use zlib 1.2.8 2020-04-18T07:13:52.225214Z 0 [Note] InnoDB: Using Linux native AIO 2020-04-18T07:13:52.308628Z 0 [Note] InnoDB: Number of pools: 1 2020-04-18T07:13:52.386475Z 0 [Note] InnoDB: Using CPU crc32 instructions 2020-04-18T07:13:52.722741Z 0 [Note] InnoDB: Initializing buffer pool, total size = 128M, instances = 1, chunk size = 128M 2020-04-18T07:13:52.829039Z 0 [Note] InnoDB: Completed initialization of buffer pool 2020-04-18T07:13:52.928039Z 0 [Note] InnoDB: If the mysqld execution user is authorized, page cleaner thread priority can be changed. See the man page of setpriority(). 2020-04-18T07:13:53.084483Z 0 [Note] InnoDB: Highest supported file format is Barracuda.
2020-04-18T07:13:53.174920Z 0 [Note] InnoDB: Log scan progressed past the checkpoint lsn 12955149 2020-04-18T07:13:53.175152Z 0 [Note] InnoDB: Doing recovery: scanned up to log sequence number 12955593 2020-04-18T07:13:53.175429Z 0 [Note] InnoDB: Doing recovery: scanned up to log sequence number 12955593 2020-04-18T07:13:53.248180Z 0 [Note] InnoDB: Database was not shutdown normally! 2020-04-18T07:13:53.248915Z 0 [Note] InnoDB: Starting crash recovery. 2020-04-18T07:13:54.130761Z 0 [Note] InnoDB: Starting an apply batch of log records to the database... InnoDB: Progress in percent: 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 2020-04-18T07:13:54.140234Z 0 [Note] InnoDB: Apply batch completed 2020-04-18T07:13:54.144374Z 0 [Note] InnoDB: Last MySQL binlog file position 0 214200, file name mysql-bin.000003 2020-04-18T07:13:56.396950Z 0 [Note] InnoDB: Removed temporary tablespace data file: "ibtmp1" 2020-04-18T07:13:56.397085Z 0 [Note] InnoDB: Creating shared tablespace for temporary tables 2020-04-18T07:13:56.397196Z 0 [Note] InnoDB: Setting file './ibtmp1' size to 12 MB. Physically writing the file full; Please wait ... 2020-04-18T07:13:56.530420Z 0 [Note] InnoDB: File './ibtmp1' size is now 12 MB. 2020-04-18T07:13:56.537638Z 0 [Note] InnoDB: 96 redo rollback segment(s) found. 96 redo rollback segment(s) are active. 2020-04-18T07:13:56.539102Z 0 [Note] InnoDB: 32 non-redo rollback segment(s) are active. 2020-04-18T07:13:56.539656Z 0 [Note] InnoDB: Waiting for purge to start 2020-04-18T07:13:56.595680Z 0 [Note] InnoDB: 5.7.13 started; log sequence number 12955593 2020-04-18T07:13:56.615211Z 0 [Note] Plugin 'FEDERATED' is disabled. 
2020-04-18T07:13:56.675438Z 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool 2020-04-18T07:13:56.864487Z 0 [Note] InnoDB: Buffer pool(s) load completed at 200418 7:13:56 2020-04-18T07:13:56.886345Z 0 [Note] Recovering after a crash using mysql-bin 2020-04-18T07:13:57.155669Z 0 [Note] Starting crash recovery... 2020-04-18T07:13:57.158952Z 0 [Note] Crash recovery finished. 2020-04-18T07:13:57.448942Z 0 [Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key 2020-04-18T07:13:57.449311Z 0 [Note] Server hostname (bind-address): '*'; port: 3306 2020-04-18T07:13:57.449468Z 0 [Note] IPv6 is available. 2020-04-18T07:13:57.449527Z 0 [Note] - '::' resolves to '::'; 2020-04-18T07:13:57.449605Z 0 [Note] Server socket created on IP: '::'. 2020-04-18T07:13:57.720495Z 0 [Warning] 'db' entry 'sys mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-18T07:13:57.722192Z 0 [Warning] 'proxies_priv' entry '@ root@localhost' ignored in --skip-name-resolve mode. 2020-04-18T07:13:58.360160Z 0 [Warning] 'tables_priv' entry 'sys_config mysql.sys@localhost' ignored in --skip-name-resolve mode. 2020-04-18T07:14:02.644486Z 0 [Note] Event Scheduler: Loaded 0 events 2020-04-18T07:14:02.681781Z 0 [Note] mysqld: ready for connections. Version: '5.7.13-log' socket: '/var/run/mysqld/mysqld.sock' port: 3306 MySQL Community Server (GPL) 2020-04-18T08:37:09.388117Z 0 [Note] InnoDB: page_cleaner: 1000ms intended loop took 1612449ms. The settings might not be optimal. (flushed=0 and evicted=0, during the time.)

D:\intelliJWorkspaces\manningWorkspace\ftgo-application>

cer commented 4 years ago

Not quite sure what is going wrong. The logs seem fine.

One question: does docker ps show the cdc service as healthy after it starts?
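For reference, `docker inspect` can print just the health state, without scanning the whole `docker ps` table. A sketch — the container name is an assumption; substitute whatever `docker ps` shows on your machine:

```shell
# Hypothetical container name; take the real one from the NAMES column of `docker ps`.
CONTAINER=ftgo-application_cdc-service_1

if command -v docker >/dev/null 2>&1; then
  # Prints one of: starting, healthy, unhealthy (only for containers with a healthcheck)
  docker inspect --format '{{.State.Health.Status}}' "$CONTAINER"
fi
```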

cer commented 4 years ago

Also, please

  1. git checkout wip-use-gradle-docker-compose
  2. ./gradlew compileAll assemble
  3. ./gradlew :composeUp

I am curious to see the output of the last command.

prakashid2 commented 4 years ago

Tried `docker-compose up -d cdc-service`, then ran `docker ps` after one minute. Output below:

D:\intelliJWorkspaces\manningWorkspace\ftgo-application>docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
4c1026381ea0 eventuateio/eventuate-cdc-service:0.4.0.RELEASE "/bin/sh -c 'java ${" 4 minutes ago Up 2 minutes (unhealthy) 0.0.0.0:8099->8080/tcp ftgo-application_cdc-service_1
e441177c9455 eventuateio/eventuate-kafka:0.3.0.RELEASE "/bin/bash -c ./run-" 5 minutes ago Up 2 minutes 0.0.0.0:9092->9092/tcp ftgo-application_kafka_1
8b92b6a275e5 eventuateio/eventuate-zookeeper:0.4.0.RELEASE "/usr/local/zookeepe" 5 minutes ago Up 2 minutes 2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp ftgo-application_zookeeper_1
7b6d9f6411f0 ftgo-application-master_dynamodblocal-init "/bin/sh -c './wait-" 6 days ago Up 18 seconds (healthy) ftgo-application-master_dynamodblocal-init_1

D:\intelliJWorkspaces\manningWorkspace\ftgo-application>
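To make the full log easier to share, it can first be captured to a file. A sketch — 4c1026381ea0 is the cdc-service container ID from the `docker ps` output above:

```shell
LOGFILE=cdc-service.log

if command -v docker >/dev/null 2>&1; then
  # Dump the last 500 log lines (stderr included) into a file for attaching to the issue
  docker logs --tail 500 4c1026381ea0 > "$LOGFILE" 2>&1
fi
```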

Since the cdc-service is showing unhealthy, I obtained its container log below.
-- cdc-service container log start --

D:\intelliJWorkspaces\manningWorkspace\ftgo-application>docker logs 4c1026381ea0

[Spring Boot ASCII-art startup banner, mangled by line wrapping]
14:43:11.670 [main] INFO o.s.b.w.e.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8080 (http)
14:43:11.911 [main] INFO o.s.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 5548 ms
14:43:13.918 [main] INFO o.s.s.c.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
14:43:14.148 [main] WARN o.a.c.retry.ExponentialBackoffRetry - maxRetries too large (2147483647). Pinning to 29
14:43:14.584 [main] INFO o.a.c.f.imps.CuratorFrameworkImpl - Starting
14:43:14.695 [main-EventThread] INFO o.a.c.f.state.ConnectionStateManager - State change: CONNECTED
14:43:14.787 [main] INFO o.a.k.c.producer.ProducerConfig - ProducerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 bootstrap.servers = [kafka:9092] ssl.keystore.type = JKS sasl.mechanism = GSSAPI max.block.ms = 60000 interceptor.classes = null ssl.truststore.password = null client.id = ssl.endpoint.identification.algorithm = null request.timeout.ms = 30000 acks = all receive.buffer.bytes = 32768 ssl.truststore.type = JKS retries = 0 ssl.truststore.location = null ssl.keystore.password = null send.buffer.bytes = 131072 compression.type = none metadata.fetch.timeout.ms = 60000 retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit buffer.memory = 33554432 timeout.ms = 30000 key.serializer = class org.apache.kafka.common.serialization.StringSerializer sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX block.on.buffer.full = false ssl.key.password = null sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 max.in.flight.requests.per.connection = 5 metrics.num.samples = 2 ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] batch.size = 16384 ssl.keystore.location = null ssl.cipher.suites = null security.protocol = PLAINTEXT max.request.size = 1048576 value.serializer = class org.apache.kafka.common.serialization.StringSerializer ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner linger.ms = 1

14:43:14.852 [main] INFO o.a.k.c.producer.ProducerConfig - ProducerConfig values: [same settings as the dump above, except client.id = producer-1]
14:43:14.855 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version : 0.10.0.1
14:43:14.855 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId : a7a17cdec9eaa6c5
14:43:15.118 [main] INFO i.e.l.u.c.p.CdcPipelineConfigurator - Starting unified cdc pipelines
14:43:15.376 [main] DEBUG i.e.local.common.CdcDataPublisher - Starting CdcDataPublisher

[14:43:15.377 through 14:43:15.446: the log repeats the same ProducerConfig dump for client.id producer-2 through producer-8 (and with empty client.id), each interleaved with identical "Kafka version : 0.10.0.1" / "Kafka commitId : a7a17cdec9eaa6c5" INFO lines and "Starting CdcDataPublisher" DEBUG lines]
14:43:15.446 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version : 0.10.0.1
14:43:15.446 [main] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId : a7a17cdec9eaa6c5
14:43:15.447 [main] DEBUG i.e.local.common.CdcDataPublisher - Starting CdcDataPublisher
14:43:15.489 [main] INFO i.e.l.u.c.p.CdcPipelineConfigurator - Unified cdc pipelines are started
14:43:15.551 [Curator-LeaderSelector-0] WARN org.apache.curator.utils.ZKPaths - The version of ZooKeeper being used doesn't support Container nodes. CreateMode.PERSISTENT will be used instead.
14:43:15.591 [Curator-LeaderSelector-0] INFO i.e.l.m.binlog.MySqlBinaryLogClient - mysql binlog client started
14:43:15.725 [Curator-LeaderSelector-0] INFO o.a.k.c.consumer.ConsumerConfig - ConsumerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor] reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 max.partition.fetch.bytes = 1048576 bootstrap.servers = [kafka:9092] ssl.keystore.type = JKS enable.auto.commit = false sasl.mechanism = GSSAPI interceptor.classes = null exclude.internal.topics = true ssl.truststore.password = null client.id = ssl.endpoint.identification.algorithm = null max.poll.records = 2147483647 check.crcs = true request.timeout.ms = 40000 heartbeat.interval.ms = 3000 auto.commit.interval.ms = 1000 receive.buffer.bytes = 65536 ssl.truststore.type = JKS ssl.truststore.location = null ssl.keystore.password = null fetch.min.bytes = 1 send.buffer.bytes = 131072 value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer group.id = c3de8eb3-d26b-4783-9604-4583929bcd1f retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX ssl.key.password = null fetch.max.wait.ms = 500 sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 session.timeout.ms = 30000 metrics.num.samples = 2 key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.keystore.location = null ssl.cipher.suites = null security.protocol = PLAINTEXT ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 auto.offset.reset = earliest

14:43:15.775 [Curator-LeaderSelector-0] INFO o.a.k.c.consumer.ConsumerConfig - ConsumerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor] reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 max.partition.fetch.bytes = 1048576 bootstrap.servers = [kafka:9092] ssl.keystore.type = JKS enable.auto.commit = false sasl.mechanism = GSSAPI interceptor.classes = null exclude.internal.topics = true ssl.truststore.password = null client.id = consumer-1 ssl.endpoint.identification.algorithm = null max.poll.records = 2147483647 check.crcs = true request.timeout.ms = 40000 heartbeat.interval.ms = 3000 auto.commit.interval.ms = 1000 receive.buffer.bytes = 65536 ssl.truststore.type = JKS ssl.truststore.location = null ssl.keystore.password = null fetch.min.bytes = 1 send.buffer.bytes = 131072 value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer group.id = c3de8eb3-d26b-4783-9604-4583929bcd1f retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX ssl.key.password = null fetch.max.wait.ms = 500 sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 session.timeout.ms = 30000 metrics.num.samples = 2 key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.keystore.location = null ssl.cipher.suites = null security.protocol = PLAINTEXT ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 auto.offset.reset = earliest

14:43:15.920 [Curator-LeaderSelector-0] INFO o.a.kafka.common.utils.AppInfoParser - Kafka version : 0.10.0.1 14:43:15.920 [Curator-LeaderSelector-0] INFO o.a.kafka.common.utils.AppInfoParser - Kafka commitId : a7a17cdec9eaa6c5 14:43:16.059 [main] INFO o.s.b.a.e.web.EndpointLinksResolver - Exposing 2 endpoint(s) beneath base path '/actuator' 14:43:16.249 [main] INFO o.s.b.w.e.tomcat.TomcatWebServer - Tomcat started on port(s): 8080 (http) with context path '' 14:43:17.283 [Timer-0] ERROR com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Exception during pool initialization. com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:988) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:341) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2251) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:136) at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:369) at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:198) at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:467) at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:541) at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:115) at 
com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) at org.springframework.jdbc.datasource.DataSourceUtils.fetchConnection(DataSourceUtils.java:151) at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:78) at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:612) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:917) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:927) at io.eventuate.local.common.CdcMonitoringDao.lambda$update$0(CdcMonitoringDao.java:36) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:22) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:51) at io.eventuate.local.common.CdcMonitoringDao.update(CdcMonitoringDao.java:32) at io.eventuate.local.db.log.common.DbLogMetrics$1.run(DbLogMetrics.java:85) at java.util.TimerThread.mainLoop(Timer.java:555) at java.util.TimerThread.run(Timer.java:505) Caused by: java.net.UnknownHostException: mysql: unknown error at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method) at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928) at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323) at java.net.InetAddress.getAllByName0(InetAddress.java:1276) at java.net.InetAddress.getAllByName(InetAddress.java:1192) at java.net.InetAddress.getAllByName(InetAddress.java:1126) at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:188) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:300) ... 33 common frames omitted 14:43:17.285 [Timer-0] ERROR io.eventuate.local.common.DaoUtils - Could not access database Failed to obtain JDBC Connection; nested exception is com. mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. - retrying in 500 milliseconds
org.springframework.jdbc.CannotGetJdbcConnectionException: Failed to obtain JDBC Connection; nested exception is com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:81) at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:612) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:917) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:927) at io.eventuate.local.common.CdcMonitoringDao.lambda$update$0(CdcMonitoringDao.java:36) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:22) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:51) at io.eventuate.local.common.CdcMonitoringDao.update(CdcMonitoringDao.java:32) at io.eventuate.local.db.log.common.DbLogMetrics$1.run(DbLogMetrics.java:85) at java.util.TimerThread.mainLoop(Timer.java:555) at java.util.TimerThread.run(Timer.java:505) Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:988) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:341) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2251) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:136) at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:369) at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:198) at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:467) at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:541) at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:115) at 
com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) at org.springframework.jdbc.datasource.DataSourceUtils.fetchConnection(DataSourceUtils.java:151) at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:78) ... 11 common frames omitted Caused by: java.net.UnknownHostException: mysql: unknown error at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method) at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928) at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323) at java.net.InetAddress.getAllByName0(InetAddress.java:1276) at java.net.InetAddress.getAllByName(InetAddress.java:1192) at java.net.InetAddress.getAllByName(InetAddress.java:1126) at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:188) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:300) ... 33 common frames omitted 14:43:17.449 [Curator-LeaderSelector-0] INFO o.a.k.c.c.i.AbstractCoordinator - Discovered coordinator 192.168.99.100:9092 (id: 2147483647 rack: null) for group c3de8eb3-d26b-4783-9604-4583929bcd1f. 14:43:17.452 [Curator-LeaderSelector-0] INFO o.a.k.c.c.i.ConsumerCoordinator - Revoking previously assigned partitions [] for group c3de8eb3-d26b-478 3-9604-4583929bcd1f 14:43:17.453 [Curator-LeaderSelector-0] INFO o.a.k.c.c.i.AbstractCoordinator - (Re-)joining group c3de8eb3-d26b-4783-9604-4583929bcd1f 14:43:18.790 [Timer-0] ERROR com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Exception during pool initialization. com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:988) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:341) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2251) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:136) at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:369) at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:198) at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:467) at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:541) at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:115) at 
com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) at org.springframework.jdbc.datasource.DataSourceUtils.fetchConnection(DataSourceUtils.java:151) at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:78) at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:612) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:917) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:927) at io.eventuate.local.common.CdcMonitoringDao.lambda$update$0(CdcMonitoringDao.java:36) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:22) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:51) at io.eventuate.local.common.CdcMonitoringDao.update(CdcMonitoringDao.java:32) at io.eventuate.local.db.log.common.DbLogMetrics$1.run(DbLogMetrics.java:85) at java.util.TimerThread.mainLoop(Timer.java:555) at java.util.TimerThread.run(Timer.java:505) Caused by: java.net.UnknownHostException: mysql at java.net.InetAddress.getAllByName0(InetAddress.java:1280) at java.net.InetAddress.getAllByName(InetAddress.java:1192) at java.net.InetAddress.getAllByName(InetAddress.java:1126) at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:188) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:300) ... 33 common frames omitted 14:43:18.791 [Timer-0] ERROR io.eventuate.local.common.DaoUtils - Could not access database Failed to obtain JDBC Connection; nested exception is com. mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. - retrying in 500 milliseconds
org.springframework.jdbc.CannotGetJdbcConnectionException: Failed to obtain JDBC Connection; nested exception is com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:81) at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:612) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:862) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:917) at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:927) at io.eventuate.local.common.CdcMonitoringDao.lambda$update$0(CdcMonitoringDao.java:36) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:22) at io.eventuate.local.common.DaoUtils.handleConnectionLost(DaoUtils.java:51) at io.eventuate.local.common.CdcMonitoringDao.update(CdcMonitoringDao.java:32) at io.eventuate.local.db.log.common.DbLogMetrics$1.run(DbLogMetrics.java:85) at java.util.TimerThread.mainLoop(Timer.java:555) at java.util.TimerThread.run(Timer.java:505) Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:988) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:341) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2251) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:404) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:136) at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:369) at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:198) at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:467) at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:541) at com.zaxxer.hikari.pool.HikariPool.(HikariPool.java:115) at 
com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:112) at org.springframework.jdbc.datasource.DataSourceUtils.fetchConnection(DataSourceUtils.java:151) at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:78) ... 11 common frames omitted Caused by: java.net.UnknownHostException: mysql at java.net.InetAddress.getAllByName0(InetAddress.java:1280) at java.net.InetAddress.getAllByName(InetAddress.java:1192) at java.net.InetAddress.getAllByName(InetAddress.java:1126) at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:188) at com.mysql.jdbc.MysqlIO.(MysqlIO.java:300) ... 33 common frames omitted 14:43:18.964 [http-nio-8080-exec-1] INFO o.s.web.servlet.DispatcherServlet - Initializing Servlet 'dispatcherServlet' 14:43:18.974 [http-nio-8080-exec-1] INFO o.s.web.servlet.DispatcherServlet - Completed initialization in 10 ms 14:43:19.072 [http-nio-8080-exec-1] INFO o.a.c.f.imps.CuratorFrameworkImpl - Starting 14:43:19.089 [http-nio-8080-exec-1-EventThread] INFO o.a.c.f.state.ConnectionStateManager - State change: CONNECTED 14:43:19.093 [Curator-Framework-0] INFO o.a.c.f.imps.CuratorFrameworkImpl - backgroundOperationsLoop exiting 14:43:19.105 [http-nio-8080-exec-1] INFO o.a.k.c.consumer.ConsumerConfig - ConsumerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor] reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 max.partition.fetch.bytes = 1048576 bootstrap.servers = [kafka:9092] ssl.keystore.type = JKS enable.auto.commit = false sasl.mechanism = GSSAPI interceptor.classes = null exclude.internal.topics = true ssl.truststore.password = null client.id = ssl.endpoint.identification.algorithm = null max.poll.records = 2147483647 check.crcs = true request.timeout.ms = 1000 
heartbeat.interval.ms = 100 auto.commit.interval.ms = 5000 receive.buffer.bytes = 65536 ssl.truststore.type = JKS ssl.truststore.location = null ssl.keystore.password = null fetch.min.bytes = 1 send.buffer.bytes = 131072 value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer group.id = 1d6a8421-0f53-4d40-89d3-5341bda0474d retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX ssl.key.password = null fetch.max.wait.ms = 500 sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 session.timeout.ms = 500 metrics.num.samples = 2 key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.keystore.location = null ssl.cipher.suites = null security.protocol = PLAINTEXT

D:\intelliJWorkspaces\manningWorkspace\ftgo-application> -- cdc-service container log ends --

--output of 1. git checkout wip-use-gradle-docker-compose --
Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)
$ git checkout wip-use-gradle-docker-compose
error: pathspec 'wip-use-gradle-docker-compose' did not match any file(s) known to git

Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master) $

--output of 2. ./gradlew compileAll assemble --
Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)
$ ./gradlew compileAll assemble
Starting a Gradle Daemon (subsequent builds will be faster)

Task :buildSrc:compileJava NO-SOURCE
Task :buildSrc:compileGroovy UP-TO-DATE
Task :buildSrc:processResources NO-SOURCE
Task :buildSrc:classes UP-TO-DATE
Task :buildSrc:jar UP-TO-DATE
Task :buildSrc:assemble UP-TO-DATE
Task :buildSrc:compileTestJava NO-SOURCE
Task :buildSrc:compileTestGroovy NO-SOURCE
Task :buildSrc:processTestResources NO-SOURCE
Task :buildSrc:testClasses UP-TO-DATE
Task :buildSrc:test SKIPPED
Task :buildSrc:check UP-TO-DATE
Task :buildSrc:build UP-TO-DATE

FAILURE: Build failed with an exception.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29s

Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)

--output of ./gradlew :composeUp --
Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)
$ ./gradlew :composeUp

Task :buildSrc:compileJava NO-SOURCE
Task :buildSrc:compileGroovy UP-TO-DATE
Task :buildSrc:processResources NO-SOURCE
Task :buildSrc:classes UP-TO-DATE
Task :buildSrc:jar UP-TO-DATE
Task :buildSrc:assemble UP-TO-DATE
Task :buildSrc:compileTestJava NO-SOURCE
Task :buildSrc:compileTestGroovy NO-SOURCE
Task :buildSrc:processTestResources NO-SOURCE
Task :buildSrc:testClasses UP-TO-DATE
Task :buildSrc:test SKIPPED
Task :buildSrc:check UP-TO-DATE
Task :buildSrc:build UP-TO-DATE

FAILURE: Build failed with an exception.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1s

Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master) $

Please reply.

Thanks Prakash S.

cer commented 4 years ago

Please look carefully for errors. The checkout failed with: error: pathspec 'wip-use-gradle-docker-compose' did not match any file(s) known to git

git checkout wip-use-gradle-docker-compose --
Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)
$ git checkout wip-use-gradle-docker-compose
error: pathspec 'wip-use-gradle-docker-compose' did not match any file(s) known to git

Lenovo@PRAKASH-LENOVO MINGW64 /d/intelliJWorkspaces/manningWorkspace/ftgo-application (master)
prakashid2 commented 4 years ago

I can see that there is no file named wip-use-gradle-docker-compose under the ftgo-application folder on GitHub. So will the 'git checkout' command ever succeed?

And what about the other two commands ('gradlew compileAll assemble' and 'gradlew :composeUp') you have provided? These are also failing. You did not comment on these.

And I've provided the container log of cdc-service, in which there is this exception trace:

14:43:17.285 [Timer-0] ERROR io.eventuate.local.common.DaoUtils - Could not access database Failed to obtain JDBC Connection; nested exception is com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
Caused by: java.net.UnknownHostException: mysql: unknown error

Let me know how can I proceed further.

cer commented 4 years ago

See this branch https://github.com/microservices-patterns/ftgo-application/tree/wip-use-gradle-docker-compose

You need to do a

  1. git fetch
  2. git checkout wip-use-gradle-docker-compose
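The missing-branch error can be reproduced in a throwaway pair of repositories. This is a sketch (the temporary paths are hypothetical) showing why a branch created upstream after your clone only becomes checkout-able once `git fetch` has created its remote-tracking ref:

```shell
#!/usr/bin/env bash
# Reproduce "error: pathspec ... did not match" and fix it with `git fetch`.
set -e
tmp=$(mktemp -d)

# "origin": a repo that will gain a branch AFTER we clone it
git init -q "$tmp/origin"
git -C "$tmp/origin" -c user.email=x@example.com -c user.name=x \
    commit -q --allow-empty -m init

git clone -q "$tmp/origin" "$tmp/clone"

# The branch appears upstream only after the clone was made
git -C "$tmp/origin" branch wip-use-gradle-docker-compose

# Without a fetch, the clone has no ref for it, so checkout fails
git -C "$tmp/clone" checkout wip-use-gradle-docker-compose 2>/dev/null \
  || echo "checkout failed before fetch"

# After a fetch, checkout creates a local branch tracking origin's
git -C "$tmp/clone" fetch -q origin
git -C "$tmp/clone" checkout -q wip-use-gradle-docker-compose
git -C "$tmp/clone" rev-parse --abbrev-ref HEAD  # prints wip-use-gradle-docker-compose
```

In the real repository, the two commands above (`git fetch`, then `git checkout wip-use-gradle-docker-compose`) do the same thing.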

These commands are failing because the first command, which checks out the branch that implements these Gradle tasks, failed:

And what about the other two commands('gradlew compileAll assemble' and 'gradlew :composeUp') you have provided? These are also failing. You did not comment on these.
cer commented 4 years ago

regarding

mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
Caused by: java.net.UnknownHostException: mysql: unknown error

For some reason, mysql is not running: the mysql container is not showing in the output of docker ps. It appears to have crashed.

It would be helpful if any logs in the issue were links to gists (https://gist.github.com/). This issue has grown so long that it's difficult to read.

prakashid2 commented 4 years ago

Ok. I'm now working on the mentioned branch (wip-use-gradle-docker-compose) instead of master. After running the following commands:

gradlew buildContracts
gradlew compileAll assemble
gradlew :composeUp

This tries to bring up all the services into a healthy state one by one, but I observed that it's taking too long (more than 10 minutes and still not completed). I got some meaningful info in the consumer-service container's log; I stopped there (Ctrl+C while running gradlew :composeUp) as the service was still reported unhealthy even after waiting 10 minutes. It shows an SQLException due to access denied for ftgo_consumer_service_user:


java.sql.SQLException: Access denied for user 'ftgo_consumer_service_user'@'172.19.0.14' (using password: YES)
	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:965) ~[mysql-connector-java-5.1.46.jar!/:5.1.46]


The full log is attached in a gist on gist.github.com

This contains the consumer-service container's log as well as the mysql container's log. I noticed the mysql container's log shows a warning:


[Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key


Could 'not having an SSL certificate' be the reason for the access denied errors for all ftgo service users? Please reply.

Earlier I did not know about the gist feature for posting logs, hence included them inline. Sorry for that.

Tried to ping 172.19.0.14. Got no reply. Why so?


Ping statistics for 172.19.0.14:
    Packets: Sent = 4, Received = 0, Lost = 4 (100% loss),


Thanks Prakash S.

prakashid2 commented 4 years ago

Hey Chris.

I've run:

docker-compose up -d kafka
docker-compose up -d cdc-service
docker-compose up -d ftgo-order-service

and then after 10 mins I'm seeing ftgo-order-service exited, but the mysql service is still up. I've provided the output of 'docker ps' and 'docker ps -a' along with a new log of the mysql container in the same gist on gist.github.com

According to the new log obtained on 2020-04-24, 'docker ps' shows the following line in its output:


2154ee5c3d4b ftgo-application_mysql "docker-entrypoint.s" 2 days ago Up 13 minutes 0.0.0.0:3306->3306/tcp ftgo-application_mysql_1


And at the time of actually writing this, it shows the mysql service up for the past 32 minutes.

Added the mysql container's log to the same gist. It has the lines below:


2020-04-24T13:27:22.398364Z 0 [Note] InnoDB: Buffer pool(s) load completed at 200424 13:27:22
2020-04-24T13:27:22.421534Z 0 [Note] Recovering after a crash using mysql-bin
2020-04-24T13:27:22.518679Z 0 [Note] Starting crash recovery...
2020-04-24T13:27:22.518735Z 0 [Note] Crash recovery finished.
2020-04-24T13:27:22.569098Z 0 [Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key


This means mysql did crash, but crash recovery also completed. I do not know why it crashed. Perhaps you can suggest something.

At the time of writing this, cdc-service and the mysql service are both showing in 'docker ps', but order-service is not.

Thanks Prakash S.

cer commented 4 years ago

[Warning] Failed to set up SSL because of the following SSL library error: SSL context is not usable without certificate and private key

Could 'not having SSL certificate' be the reason of access denied to all ftgo services users? Please reply.

No.

Earlier I did not know the gist feature for posting logs hence included online. Sorry for that.

Tried to ping 172.19.0.14. No reply got. Why so?

Docker networking probably doesn't allow that.

Sadly, I'm not sure why it's not working on your machine.

cer commented 4 years ago

Please just use the wip-use-gradle-docker-compose branch to debug this problem.

First, run this:

./gradlew :composeUp -P startedService=mysql

Assuming that the directory is ftgo-application, run this command:

docker exec -it ftgo-application_mysql_1 bash

you should get a # prompt

Run this command

mysql -hlocalhost -uroot -prootpassword

you should get a mysql> prompt

Run this command

show databases;

This is what I get:

mysql> show databases;
+-------------------------+
| Database                |
+-------------------------+
| information_schema      |
| eventuate               |
| ftgo_accounting_service |
| ftgo_consumer_service   |
| ftgo_kitchen_service    |
| ftgo_order_service      |
| ftgo_restaurant_service |
| mysql                   |
| performance_schema      |
| sys                     |
+-------------------------+
10 rows in set (0.00 sec)
prakashid2 commented 4 years ago

Chris.

Yes. I'm working on the mentioned branch only. The output of the very first command says 'Connection refused'.


D:\intelliJWorkspaces\manningWorkspace\wip-use-gradle-docker-compose-Branch\ftgo-application>gradlew :composeUp -P startedService=mysql

Task :composeUp
Building mysql
Creating network "ftgo-application_default" with the default driver
Creating ftgo-application_mysql_1 ... done
<-------------> 0% EXECUTING [8s]
DOCKER_HOST environment variable detected - will be used as hostname of service mysql (192.168.99.100)
Probing TCP socket on 192.168.99.100:3306 of service 'mysql_1'
Waiting for TCP socket on 192.168.99.100:3306 of service 'mysql_1' (Connection refused: connect)
(the line above repeated until I terminated the build after about a minute)
<-------------> 0% EXECUTING [1m 11s] :composeUp
Terminate batch job (Y/N)? y


I have the MySQL server installed on my PC. Do I need to run mysqld before I run gradlew :composeUp?

Thanks Prakash S.

cer commented 4 years ago

The example is starting its own copy of MySQL as a Docker container.

gradlew :composeUp -P startedService=mysql can't connect.

I assume that 192.168.99.100 is still the IP address of your Docker Toolbox VM.

What's the output of docker ps -a and docker logs ftgo-application_mysql_1?

prakashid2 commented 4 years ago

The output shows my Docker Toolbox is still using 192.168.99.100 as DOCKER_HOST_IP. I also verified DOCKER_HOST_IP with the following command:

docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE

This displays the following line on the console: Making HTTP request to self via url= http://192.168.99.100:8889 SUCCESS!!!!


But the problem continues; the connection is still refused. Could this be caused by the older MySQL server version (5.7.x)? Or could it be a JDBC driver problem? I don't know, but I thought I'd confirm, since the existing 'mysql-8.0.13-winx64' installation runs perfectly without any problems on my PC (I did not test it with any microservices app or Docker). Still, I feel 5.7.x should also work fine.


The command 'gradlew :composeUp -P startedService=mysql' could not bring the mysql service up. I terminated it using Ctrl+C, and then 'docker ps -a' shows the mysql service exited. I added a new log of the mysql container to the same gist link.

Thanks Prakash S.

cer commented 4 years ago

Regarding

But the problem continues. Connection still refused.
Could this be a problem because older version (5.7.x) of mysql server? Or could it be JDBC driver problem? I do not know but thought of confirming as existing 'mysql-8.0.13-winx64' version running perfectly without any problems on my PC(I did not test it any microservices app or docker). But I feel 5.7.x should also work fine.

What's running on your machine is quite separate from what's running in Docker containers. What's more, the FTGO example application runs without issue on numerous machines. The challenge is figuring out why it is not working for you.

cer commented 4 years ago

This looks like the problem - see end of MySQL log:

/usr/local/bin/docker-entrypoint.sh: running /docker-entrypoint-initdb.d/4.compile-schema-per-service.sh
/docker-entrypoint-initdb.d/4.compile-schema-per-service.sh: line 2: $'\r': command not found

The MySQL server is failing to start because of what looks like a Windows vs. Linux line ending problem.

I think what has happened is that git clone checked out files with line endings suitable for your Windows machine. However, this is a problem for mysql/*.sh since those scripts execute in a Linux Docker container.

Can you try converting the line endings of those files to Linux-style newlines? https://stackoverflow.com/questions/20368781/anything-like-dos2unix-for-windows
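For reference, a minimal sketch of that conversion using sed (assuming a Git Bash or similar Unix-like shell on Windows; dos2unix itself works too):

```shell
# Sketch: strip the trailing carriage return (\r) from each line of the
# MySQL init scripts so they can run inside the Linux container.
# Assumes you are in the repository root with a Unix-like shell available.
for f in mysql/*.sh; do
  sed -i 's/\r$//' "$f"
done
```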

cer commented 4 years ago

I just pushed a change to the wip-use-gradle-docker-compose branch

  1. git pull
  2. git add --renormalize mysql/*.sh dynamodblocal-init/*.sh

I think the git add --renormalize should fix the files.

Please try running MySql again: https://github.com/microservices-patterns/ftgo-application/issues/85#issuecomment-619120279
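The same fix can be expressed declaratively; a hedged sketch of forcing LF endings for shell scripts via .gitattributes (the pushed change may already contain an equivalent rule, so this is illustrative only):

```shell
# Sketch: tell Git to always check out shell scripts with LF endings,
# then renormalize the working tree. The .gitattributes rule shown here
# is a hypothetical example; the branch may already define an equivalent.
echo '*.sh text eol=lf' >> .gitattributes
git add --renormalize .
```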

prakashid2 commented 4 years ago

Chris.

Since you've pushed the necessary change to the 'wip-use-gradle-docker-compose' branch, I took the latest version of that branch using the following command:


git clone -b wip-use-gradle-docker-compose --single-branch https://github.com/microservices-patterns/ftgo-application.git


and did the build process again, then ran composeUp on the mysql service. I could log in to mysql at the # prompt and see the databases you mentioned. Finally, I brought all the Docker processes up using gradlew :composeUp, and I can see the swagger UIs now. Thanks a lot for helping me get the example project running on my PC. Below is a quick summary of the commands I used to get the ftgo-application running; it may be helpful to others.

  1. git clone -b wip-use-gradle-docker-compose --single-branch https://github.com/microservices-patterns/ftgo-application.git
  2. cd ftgo-application
  3. gradlew buildContracts
  4. gradlew compileAll assemble
  5. docker-machine start
  6. docker-machine env
  7. set DOCKER_HOST_IP=192.168.99.100 [the developer needs to use the IP assigned to their Docker Toolbox]
  8. docker run -p 8889:8888 -e DOCKER_DIAGNOSTICS_PORT=8889 -e DOCKER_HOST_IP --rm eventuateio/eventuateio-docker-networking-diagnostics:0.2.0.RELEASE -- this command verifies whether DOCKER_HOST_IP is correct
  9. gradlew :composeUp -P startedService=mysql -- this command brings up the mysql service. Verify using 'docker ps' and 'docker ps -a'. Also check the mysql container's log using the command 'docker logs '
  10. If you're here, the mysql container is up and running and you now want to bring all the services up. Use the following: gradlew :composeUp
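Once everything is up, here is a hedged sketch of invoking one service, for example 'Create Consumer' directly against the Consumer Service on port 8081. The /consumers path and the name fields are assumptions based on the Consumer Service API; verify them against its swagger UI:

```shell
# Hypothetical example: create a consumer (port 8081 = ftgo-consumer-service).
# The /consumers route and the request body fields are assumptions;
# verify them against the Consumer Service swagger UI before relying on them.
curl -X POST http://192.168.99.100:8081/consumers \
  -H "Content-Type: application/json" \
  -d '{"name": {"firstName": "Prakash", "lastName": "S"}}'
```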

-- sample output of 'gradlew :composeUp' --

D:\intelliJWorkspaces\manningWorkspace\wip-use-gradle-docker-compose-Branch\ftgo-application>gradlew :composeUp
Starting a Gradle Daemon, 1 stopped Daemon could not be reused, use --status for details

Task :composeUp
zookeeper uses an image, skipping
kafka uses an image, skipping
cdc-service uses an image, skipping
zipkin uses an image, skipping
Building mysql
Building ftgo-consumer-service
Building ftgo-kitchen-service
Building ftgo-restaurant-service
Building ftgo-accounting-service
Building ftgo-api-gateway
Building ftgo-order-service
Building dynamodblocal
Building dynamodblocal-init
Building ftgo-order-history-service
Creating network "ftgo-application_default" with the default driver
Creating ftgo-application_ftgo-api-gateway_1 ... done
Creating ftgo-application_mysql_1 ... done
Creating ftgo-application_zookeeper_1 ... done
Creating ftgo-application_dynamodblocal_1 ... done
Creating ftgo-application_zipkin_1 ... done
Creating ftgo-application_kafka_1 ... done
Creating ftgo-application_dynamodblocal-init_1 ... done
Creating ftgo-application_cdc-service_1 ... done
Creating ftgo-application_ftgo-accounting-service_1 ... done
Creating ftgo-application_ftgo-order-service_1 ... done
Creating ftgo-application_ftgo-restaurant-service_1 ... done
Creating ftgo-application_ftgo-consumer-service_1 ... done
Creating ftgo-application_ftgo-kitchen-service_1 ... done
Creating ftgo-application_ftgo-order-history-service_1 ... done
DOCKER_HOST environment variable detected - will be used as hostname of service zookeeper (192.168.99.100)
[same message repeated for each of the other services]
Waiting for cdc-service_1 to become healthy (it's unhealthy)  [repeated until healthy]
cdc-service_1 health state reported as 'healthy' - continuing...
Waiting for ftgo-consumer-service_1 to become healthy (it's unhealthy)  [repeated until healthy]
ftgo-consumer-service_1 health state reported as 'healthy' - continuing...
ftgo-kitchen-service_1 health state reported as 'healthy' - continuing...
ftgo-restaurant-service_1 health state reported as 'healthy' - continuing...
ftgo-accounting-service_1 health state reported as 'healthy' - continuing...
ftgo-api-gateway_1 health state reported as 'healthy' - continuing...
ftgo-order-service_1 health state reported as 'healthy' - continuing...
dynamodblocal_1 health state reported as 'healthy' - continuing...
dynamodblocal-init_1 health state reported as 'healthy' - continuing...
ftgo-order-history-service_1 health state reported as 'healthy' - continuing...
TCP socket on 192.168.99.100:2181 of service 'zookeeper_1' is ready
TCP socket on 192.168.99.100:9092 of service 'kafka_1' is ready
TCP socket on 192.168.99.100:3306 of service 'mysql_1' is ready
TCP socket on 192.168.99.100:8099 of service 'cdc-service_1' is ready
TCP socket on 192.168.99.100:8081 of service 'ftgo-consumer-service_1' is ready
TCP socket on 192.168.99.100:8083 of service 'ftgo-kitchen-service_1' is ready
TCP socket on 192.168.99.100:8084 of service 'ftgo-restaurant-service_1' is ready
TCP socket on 192.168.99.100:8085 of service 'ftgo-accounting-service_1' is ready
TCP socket on 192.168.99.100:8087 of service 'ftgo-api-gateway_1' is ready
TCP socket on 192.168.99.100:9411 of service 'zipkin_1' is ready
TCP socket on 192.168.99.100:8082 of service 'ftgo-order-service_1' is ready
TCP socket on 192.168.99.100:8000 of service 'dynamodblocal_1' is ready
TCP socket on 192.168.99.100:8086 of service 'ftgo-order-history-service_1' is ready

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 5m 0s
1 actionable task: 1 executed
D:\intelliJWorkspaces\manningWorkspace\wip-use-gradle-docker-compose-Branch\ftgo-application>


  1. I can see the swagger UIs (screenshots attached) and the api gateway link (http://192.168.99.100:8087/) is accessible

[Screenshots: consumerSwaggerUi, orderHistorySwaggerUi, orderSwaggerUi, restaurantSwaggerUi]

Thanking you. Prakash S. Mumbai, India

prakashid2 commented 4 years ago

Chris.

One last question: how do I specify the price of a menu item in 'Create Restaurant'? Please check whether the following request body is proper:

{
  "menu": {
    "menuItems": [
      {
        "id": "101S",
        "name": "Burger",
        "price": {10}
      }
    ]
  },
  "name": "Prashant Snacks Corner"
}


I'm getting the following error related to the 'net.chrisrichardson.ftgo.common.Money' object:


{ "timestamp": 1587825240203, "status": 400, "error": "Bad Request", "message": "JSON parse error: Cannot deserialize instance of net.chrisrichardson.ftgo.common.Money out of START_OBJECT token; nested exception is com.fasterxml.jackson.databind.JsonMappingException: Cannot deserialize instance of net.chrisrichardson.ftgo.common.Money out of START_OBJECT token\n at [Source: (PushbackInputStream); line: 7, column: 18] (through reference chain: net.chrisrichardson.ftgo.restaurantservice.events.CreateRestaurantRequest[\"menu\"]->net.chrisrichardson.ftgo.restaurantservice.events.RestaurantMenu[\"menuItems\"]->java.util.ArrayList[0]->net.chrisrichardson.ftgo.restaurantservice.events.MenuItem[\"price\"])", "path": "/restaurants" }


Thanks Prakash S.

prakashid2 commented 4 years ago

I forgot to attach the screenshot. Attaching it now.

[Screenshot: ErrorInCreateRestaurant-CannotDeserialize-net chrisrichardson ftgo common Money]

cer commented 4 years ago
{
  "menu": {
    "menuItems": [
      {
        "id": "1",
        "name": "Chicken Tika",
        "price": "10.00"
      }
    ]
  },
  "name": "Ajanta"
}
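A hedged usage sketch of that body: POSTing it to the Restaurant Service. The /restaurants path comes from the error response above, and port 8084 matches ftgo-restaurant-service in the composeUp logs, but verify both against the restaurant swagger UI:

```shell
# Post the working Create Restaurant body. The /restaurants path is taken
# from the error response above; port 8084 matches ftgo-restaurant-service
# in the composeUp logs. Verify both against the restaurant swagger UI.
curl -X POST http://192.168.99.100:8084/restaurants \
  -H "Content-Type: application/json" \
  -d '{
    "menu": {
      "menuItems": [
        { "id": "1", "name": "Chicken Tika", "price": "10.00" }
      ]
    },
    "name": "Ajanta"
  }'
```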
prakashid2 commented 4 years ago

Chris.

The above request body worked fine. I'm now trying another service, 'Revise Order'. What would a sample request body look like? What should I specify in place of 'additionalProp1', 'additionalProp2', etc.? Does this operation revise the total order price?


{ "revisedLineItemQuantities": { "additionalProp1": 0, "additionalProp2": 0, "additionalProp3": 0 } }


Thanks Prakash S.

cer commented 4 years ago

I'll investigate, but you have access to the source code!

e.g. https://github.com/microservices-patterns/ftgo-application/blob/eb367276256e266364906079959f884c2a666b01/ftgo-order-service/src/main/java/net/chrisrichardson/ftgo/orderservice/web/OrderController.java#L36
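Reading the OrderController source linked above, the keys of revisedLineItemQuantities appear to be menu item ids and the values the new quantities. A hedged sketch, in which the endpoint path, order id, and menu item id are all assumptions to be verified against the source (port 8082 matches ftgo-order-service in the composeUp logs):

```shell
# Hypothetical Revise Order request: the keys of revisedLineItemQuantities
# appear to be menu item ids and the values the new quantities.
# The /orders/1/revise path and the ids are assumptions; verify them
# against OrderController in ftgo-order-service before relying on this.
curl -X POST http://192.168.99.100:8082/orders/1/revise \
  -H "Content-Type: application/json" \
  -d '{"revisedLineItemQuantities": {"1": 2}}'
```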