apache / incubator-seata

:fire: Seata is an easy-to-use, high-performance, open source distributed transaction solution.
https://seata.apache.org/
Apache License 2.0

0304 register RM failed io.seata.common.exception.FrameworkException: connect failed, can not connect to services-server (client fails to start) #2107

Closed yudar1024 closed 4 years ago

yudar1024 commented 4 years ago

Ⅰ. Issue Description

The client cannot start.

Ⅱ. Describe what happened

If there is an exception, please attach the exception trace:

2019-12-29 23:34:31.031  INFO 29548 --- [imeoutChecker_1] i.s.c.r.netty.NettyClientChannelManager  : will connect to 192.168.2.127:8091
2019-12-29 23:34:31.032  INFO 29548 --- [imeoutChecker_1] i.s.core.rpc.netty.NettyPoolableFactory  : NettyPool create channel to transactionRole:TMROLE,address:192.168.2.127:8091,msg:< RegisterTMRequest{applicationId='sprcloudaliorder', transactionServiceGroup='order-service-group'} >
2019-12-29 23:34:31.035  WARN 29548 --- [imeoutChecker_1] io.netty.channel.AbstractChannel         : Force-closing a channel whose registration task was not accepted by an event loop: [id: 0x1adf7db4]

java.util.concurrent.RejectedExecutionException: event executor terminated
    at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:987)
    at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:388)
    at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:381)
    at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:886)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.register(AbstractChannel.java:472)
    at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:87)
    at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:81)
    at io.netty.channel.MultithreadEventLoopGroup.register(MultithreadEventLoopGroup.java:86)
    at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:311)
    at io.netty.bootstrap.Bootstrap.doResolveAndConnect(Bootstrap.java:157)
    at io.netty.bootstrap.Bootstrap.connect(Bootstrap.java:141)
    at io.seata.core.rpc.netty.RpcClientBootstrap.getNewChannel(RpcClientBootstrap.java:184)
    at io.seata.core.rpc.netty.NettyPoolableFactory.makeObject(NettyPoolableFactory.java:60)
    at io.seata.core.rpc.netty.NettyPoolableFactory.makeObject(NettyPoolableFactory.java:35)
    at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1220)
    at io.seata.core.rpc.netty.NettyClientChannelManager.doConnect(NettyClientChannelManager.java:202)
    at io.seata.core.rpc.netty.NettyClientChannelManager.acquireChannel(NettyClientChannelManager.java:102)
    at io.seata.core.rpc.netty.NettyClientChannelManager.reconnect(NettyClientChannelManager.java:171)
    at io.seata.core.rpc.netty.AbstractRpcRemotingClient$1.run(AbstractRpcRemotingClient.java:113)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:834)

2019-12-29 23:34:31.038 ERROR 29548 --- [imeoutChecker_1] i.s.c.r.netty.NettyClientChannelManager  : 0304 register RM failed.

io.seata.common.exception.FrameworkException: can not connect to services-server.
    at io.seata.core.rpc.netty.RpcClientBootstrap.getNewChannel(RpcClientBootstrap.java:195)
    at io.seata.core.rpc.netty.NettyPoolableFactory.makeObject(NettyPoolableFactory.java:60)
    at io.seata.core.rpc.netty.NettyPoolableFactory.makeObject(NettyPoolableFactory.java:35)
    at org.apache.commons.pool.impl.GenericKeyedObjectPool.borrowObject(GenericKeyedObjectPool.java:1220)
    at io.seata.core.rpc.netty.NettyClientChannelManager.doConnect(NettyClientChannelManager.java:202)
    at io.seata.core.rpc.netty.NettyClientChannelManager.acquireChannel(NettyClientChannelManager.java:102)
    at io.seata.core.rpc.netty.NettyClientChannelManager.reconnect(NettyClientChannelManager.java:171)
    at io.seata.core.rpc.netty.AbstractRpcRemotingClient$1.run(AbstractRpcRemotingClient.java:113)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: io.seata.common.exception.FrameworkException: connect failed, can not connect to services-server.
    at io.seata.core.rpc.netty.RpcClientBootstrap.getNewChannel(RpcClientBootstrap.java:190)
    ... 14 common frames omitted
Caused by: java.util.concurrent.RejectedExecutionException: event executor terminated
    at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:987)
    at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:388)
    at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:381)
    at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:886)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.register(AbstractChannel.java:472)
    at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:87)
    at io.netty.channel.SingleThreadEventLoop.register(SingleThreadEventLoop.java:81)
    at io.netty.channel.MultithreadEventLoopGroup.register(MultithreadEventLoopGroup.java:86)
    at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:311)
    at io.netty.bootstrap.Bootstrap.doResolveAndConnect(Bootstrap.java:157)
    at io.netty.bootstrap.Bootstrap.connect(Bootstrap.java:141)
    at io.seata.core.rpc.netty.RpcClientBootstrap.getNewChannel(RpcClientBootstrap.java:184)
    ... 14 common frames omitted

2019-12-29 23:34:31.038 ERROR 29548 --- [imeoutChecker_1] i.s.c.r.netty.NettyClientChannelManager  : 0101 can not connect to 192.168.2.127:8091 cause:can not register RM,err:can not connect to services-server.

io.seata.common.exception.FrameworkException: can not register RM,err:can not connect to services-server.
    at io.seata.core.rpc.netty.NettyClientChannelManager.doConnect(NettyClientChannelManager.java:206)
    at io.seata.core.rpc.netty.NettyClientChannelManager.acquireChannel(NettyClientChannelManager.java:102)
    at io.seata.core.rpc.netty.NettyClientChannelManager.reconnect(NettyClientChannelManager.java:171)
    at io.seata.core.rpc.netty.AbstractRpcRemotingClient$1.run(AbstractRpcRemotingClient.java:113)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:834)
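
The RejectedExecutionException ("event executor terminated") above is raised when a reconnect task is submitted after the client's Netty event loop group has already been shut down; the underlying symptom is still the failed TCP connection to 192.168.2.127:8091. A quick reachability check from the client machine can rule out basic network problems (a diagnostic sketch; host and port are taken from the log above):

    # verify the seata-server port is reachable from the client host
    telnet 192.168.2.127 8091
    # or, if telnet is unavailable:
    nc -zv 192.168.2.127 8091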

Server-side registry.conf

registry {
  # file 、nacos 、eureka、redis、zk、consul、etcd3、sofa
  type = "nacos"

  nacos {
    serverAddr = "localhost:8848"
    namespace = ""
    cluster = "default"
  }
}

config {
  # file、nacos 、apollo、zk、consul、etcd3
  type = "nacos"

  nacos {
    serverAddr = "localhost::8848"
    namespace = ""
    cluster = "default"
  }
}
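
To confirm the server actually registered itself in Nacos, its instance list can be queried through the Nacos console or the Open API (a sketch; the service name seata-server is assumed to be Seata's default registration name, so check the actual name in the Nacos console if it was overridden):

    # list instances registered under the seata-server service in Nacos
    curl 'http://localhost:8848/nacos/v1/ns/instance/list?serviceName=seata-server'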

Client-side registry.conf

registry {
  # file 、nacos 、eureka、redis、zk、consul、etcd3、sofa
  type = "nacos"

  nacos {
    serverAddr = "localhost:8848"
    namespace = ""
    cluster = "default"
  }
  # eureka {
  #   serviceUrl = "http://localhost:8761/eureka"
  #   application = "default"
  #   weight = "1"
  # }
  # redis {
  #   serverAddr = "localhost:6379"
  #   db = "0"
  # }
  # zk {
  #   cluster = "default"
  #   serverAddr = "127.0.0.1:2181"
  #   session.timeout = 6000
  #   connect.timeout = 2000
  # }
  # consul {
  #   cluster = "default"
  #   serverAddr = "127.0.0.1:8500"
  # }
  # etcd3 {
  #   cluster = "default"
  #   serverAddr = "http://localhost:2379"
  # }
  # sofa {
  #   serverAddr = "127.0.0.1:9603"
  #   application = "default"
  #   region = "DEFAULT_ZONE"
  #   datacenter = "DefaultDataCenter"
  #   cluster = "default"
  #   group = "SEATA_GROUP"
  #   addressWaitTime = "3000"
  # }
  # file {
  #   name = "file.conf"
  # }
}

config {
  # file、nacos 、apollo、zk、consul、etcd3
  type = "nacos"

  nacos {
    serverAddr = "localhost:8848"
    namespace = ""
    cluster="default"
  }
  # consul {
  #   serverAddr = "127.0.0.1:8500"
  # }
  # apollo {
  #   app.id = "seata-server"
  #   apollo.meta = "http://192.168.1.204:8801"
  # }
  # zk {
  #   serverAddr = "127.0.0.1:2181"
  #   session.timeout = 6000
  #   connect.timeout = 2000
  # }
  # etcd3 {
  #   serverAddr = "http://localhost:2379"
  # }
  # file {
  #   name = "file.conf"
  # }
}

config.txt, imported into Nacos

transport.type=TCP
transport.server=NIO
transport.heartbeat=true
transport.enable-client-batch-send-request=false
transport.thread-factory.boss-thread-prefix=NettyBoss
transport.thread-factory.worker-thread-prefix=NettyServerNIOWorker
transport.thread-factory.server-executor-thread-prefix=NettyServerBizHandler
transport.thread-factory.share-boss-worker=false
transport.thread-factory.client-selector-thread-prefix=NettyClientSelector
transport.thread-factory.client-selector-thread-size=1
transport.thread-factory.client-worker-thread-prefix=NettyClientWorkerThread
transport.thread-factory.boss-thread-size=1
transport.thread-factory.worker-thread-size=8
transport.shutdown.wait=3
service.vgroup_mapping.order-service-group=default
service.vgroup_mapping.storage-service-group=default
service.default.grouplist=127.0.0.1:8091
service.enableDegrade=false
service.disableGlobalTransaction=false
client.rm.async.commit.buffer.limit=10000
client.rm.lock.retry.internal=10
client.rm.lock.retry.times=30
client.rm.report.retry.count=5
client.rm.lock.retry.policy.branch-rollback-on-conflict=true
client.rm.table.meta.check.enable=false
client.rm.report.success.enable=true
client.tm.commit.retry.count=5
client.tm.rollback.retry.count=5
store.mode=file
store.file.dir=file_store/data
store.file.max-branch-session-size=16384
store.file.max-global-session-size=512
store.file.file-write-buffer-cache-size=16384
store.file.flush-disk-mode=async
store.file.session.reload.read_size=100
store.db.datasource=dbcp
store.db.db-type=mysql
store.db.driver-class-name=com.mysql.jdbc.Driver
store.db.url=jdbc:mysql://127.0.0.1:3306/seata?useUnicode=true
store.db.user=mysql
store.db.password=openstack
store.db.min-conn=1
store.db.max-conn=3
store.db.global.table=global_table
store.db.branch.table=branch_table
store.db.query-limit=100
store.db.lock-table=lock_table
server.recovery.committing-retry-period=1000
server.recovery.asyn-committing-retry-period=1000
server.recovery.rollbacking-retry-period=1000
server.recovery.timeout-retry-period=1000
server.max.commit.retry.timeout=-1
server.max.rollback.retry.timeout=-1
client.undo.data.validation=true
client.undo.log.serialization=jackson
server.undo.log.save.days=7
server.undo.log.delete.period=86400000
client.undo.log.table=undo_log
client.log.exceptionRate=100
transport.serialization=seata
transport.compressor=none
metrics.enabled=false
metrics.registry-type=compact
metrics.exporter-list=prometheus
metrics.exporter-prometheus-port=9898
client.support.spring.datasource.autoproxy=false
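
These keys work together: the client's spring.cloud.alibaba.seata.tx-service-group (order-service-group, see application.yaml below) is resolved through service.vgroup_mapping.<group> to a cluster name, and the server address for that cluster is then looked up in the registry (Nacos here); service.<cluster>.grouplist is only read when the registry type is file. A hedged sketch of the two relevant lines, with the loopback address replaced by the server's LAN IP as suggested later in this thread:

    # transaction service group -> cluster name
    service.vgroup_mapping.order-service-group=default
    # cluster -> server address; consulted when registry.type = "file",
    # and a loopback value here is a common source of connect failures
    service.default.grouplist=192.168.2.127:8091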

Client-side application.yaml

spring:
  devtools:
    restart:
      enabled: false
    livereload:
      enabled: false
  datasource:
    druid:
      # connection info
#      url: jdbc:p6spy:mysql://localhost:3306/seata_order
      url: jdbc:mysql://localhost:3306/seata_order?characterEncoding=utf8&connectTimeout=1000&socketTimeout=3000&autoReconnect=true&useUnicode=true&useSSL=false&serverTimezone=UTC&allowMultiQueries=true
      username: root
      password: openstack
      # for SQL logging; do not use in production
#      driver-class-name: com.p6spy.engine.spy.P6SpyDriver
      driver-class-name: com.mysql.cj.jdbc.Driver
      # connection pool configuration
      min-idle: 5
      initial-size: 5
      max-active: 20
      # maximum wait time for acquiring a connection, in milliseconds
      max-wait: 60000
      # how often to run the check that evicts idle connections, in milliseconds
      time-between-eviction-runs-millis: 60000
      # minimum time a connection must sit idle in the pool before eviction, in milliseconds
      min-evictable-idle-time-millis: 30000
      validation-query: SELECT 1 FROM DUAL
      test-while-idle: true
      test-on-borrow: false
      test-on-return: false
      # monitoring
      filter:
        wall:
          # set to true in production
          enabled: false
  liquibase:
    contexts: prod
    enabled: false

  mail:
    host: localhost
    port: 25
    username:
    password:
  thymeleaf:
    cache: true
  sleuth:
    sampler:
      probability: 1 # report 100% of traces
  zipkin: # Use the "zipkin" Maven profile to have the Spring Cloud Zipkin dependencies
    base-url: http://localhost:9411
    enabled: false
    locator:
      discovery:
        enabled: true
  cloud:
    alibaba:
      seata:
        tx-service-group: order-service-group 

Ⅲ. Describe what you expected to happen

Ⅳ. How to reproduce it (as minimally and precisely as possible)

  1. Start the server with -h 192.168.2.127 (a full command sketch follows this list)
  2. Start the client from the IDE
  3. xxx
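
A minimal sketch of the server start command from step 1 (script path as shipped in the Seata 1.0 binary distribution; 8091 is the default port):

    sh bin/seata-server.sh -h 192.168.2.127 -p 8091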

Ⅴ. Anything else we need to know?

Ⅵ. Environment:

yudar1024 commented 4 years ago

The Seata version is 1.0.

zjinlei commented 4 years ago

can not connect to 192.168.2.127:8091 http://seata.io/zh-cn/docs/overview/faq.html#7

DragonZru commented 2 months ago

Change service.default.grouplist=127.0.0.1:8091 to a public or intranet IP. If seata-server is deployed via Docker, set SEATA_IP=192.168.xx.xx.
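
A minimal sketch of the Docker variant mentioned above (the image tag and IP are illustrative; SEATA_IP sets the address the container registers with Nacos):

    docker run -d --name seata-server \
      -p 8091:8091 \
      -e SEATA_IP=192.168.2.127 \
      seataio/seata-server:1.0.0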