apache / apisix

The Cloud-Native API Gateway
https://apisix.apache.org/blog/
Apache License 2.0

help request: After restarting, the Apisix routing configuration does not take effect #9847

Closed ShuLian1984 closed 1 year ago

ShuLian1984 commented 1 year ago

After restarting an otherwise normally running APISIX container, the routing rules no longer take effect, even though the routing configuration is not lost: it can still be retrieved through the admin API, and no error messages were found in the logs.

I need to resubmit each route rule before it takes effect again.

- Operating system (from `/etc/os-release`):

CENTOS_MANTISBT_PROJECT="CentOS-7" CENTOS_MANTISBT_PROJECT_VERSION="7" REDHAT_SUPPORT_PRODUCT="centos" REDHAT_SUPPORT_PRODUCT_VERSION="7"


- OpenResty / Nginx version (run `openresty -V` or `nginx -V`):

nginx version: openresty/1.21.4.1 built by gcc 10.2.1 20210110 (Debian 10.2.1-6) built with OpenSSL 1.1.1s 1 Nov 2022 TLS SNI support enabled configure arguments: --prefix=/usr/local/openresty/nginx --with-cc-opt='-O2 -DAPISIX_BASE_VER=1.21.4.1.8 -DNGX_GRPC_CLI_ENGINE_PATH=/usr/local/openresty/libgrpc_engine.so -DNGX_HTTP_GRPC_CLI_ENGINE_PATH=/usr/local/openresty/libgrpc_engine.so -DNGX_LUA_ABORT_AT_PANIC -I/usr/local/openresty/zlib/include -I/usr/local/openresty/pcre/include -I/usr/local/openresty/openssl111/include' --add-module=../ngx_devel_kit-0.3.1 --add-module=../echo-nginx-module-0.62 --add-module=../xss-nginx-module-0.06 --add-module=../ngx_coolkit-0.2 --add-module=../set-misc-nginx-module-0.33 --add-module=../form-input-nginx-module-0.12 --add-module=../encrypted-session-nginx-module-0.09 --add-module=../srcache-nginx-module-0.32 --add-module=../ngx_lua-0.10.21 --add-module=../ngx_lua_upstream-0.07 --add-module=../headers-more-nginx-module-0.33 --add-module=../array-var-nginx-module-0.05 --add-module=../memc-nginx-module-0.19 --add-module=../redis2-nginx-module-0.15 --add-module=../redis-nginx-module-0.3.9 --add-module=../ngx_stream_lua-0.0.11 --with-ld-opt='-Wl,-rpath,/usr/local/openresty/luajit/lib -Wl,-rpath,/usr/local/openresty/wasmtime-c-api/lib -L/usr/local/openresty/zlib/lib -L/usr/local/openresty/pcre/lib -L/usr/local/openresty/openssl111/lib -Wl,-rpath,/usr/local/openresty/zlib/lib:/usr/local/openresty/pcre/lib:/usr/local/openresty/openssl111/lib' --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../mod_dubbo-1.0.2 --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../ngx_multi_upstream_module-1.1.1 --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../apisix-nginx-module-1.12.0 --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../apisix-nginx-module-1.12.0/src/stream --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../apisix-nginx-module-1.12.0/src/meta --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../wasm-nginx-module-0.6.4 --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../lua-var-nginx-module-v0.5.3 --add-module=/tmp/tmp.kOo9UFlgbS/openresty-1.21.4.1/../grpc-client-nginx-module-v0.4.2 --with-poll_module --with-pcre-jit --with-stream --with-stream_ssl_module --with-stream_ssl_preread_module --with-http_v2_module --without-mail_pop3_module --without-mail_imap_module --without-mail_smtp_module --with-http_stub_status_module --with-http_realip_module --with-http_addition_module --with-http_auth_request_module --with-http_secure_link_module --with-http_random_index_module --with-http_gzip_static_module --with-http_sub_module --with-http_dav_module --with-http_flv_module --with-http_mp4_module --with-http_gunzip_module --with-threads --with-compat --with-stream --with-http_ssl_module

- etcd version, if relevant (run `curl http://127.0.0.1:9090/v1/server_info`):

bitnami/etcd:3.5.9-2


- APISIX Dashboard version, if relevant:

apache/apisix-dashboard:3.0.1

ShuLian1984 commented 1 year ago

docker-compose.yml

version: '2.4'
services:
  etcd:
    container_name: etcd
    image: bitnami/etcd:3.5.9-2
    restart: always
    environment:
      ETCD_ENABLE_V2: "true"
      ALLOW_NONE_AUTHENTICATION: "yes"
      ETCD_ADVERTISE_CLIENT_URLS: "http://etcd:2379"
      ETCD_LISTEN_CLIENT_URLS: "http://0.0.0.0:2379"
    volumes:
      - "./data:/bitnami/etcd/data"
    ports:
      - "2379:2379/tcp"
    networks:
      apisix:
        ipv4_address: 172.64.0.2

  apisix:
    container_name: apisix
    image: apache/apisix:3.3.0
    restart: always
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - ./config/apisix.yaml:/usr/local/apisix/conf/config.yaml:ro
      - ./apisix:/usr/local/apisix/apisix
      - ./logs:/usr/local/apisix/logs:rw
    ports:
      - "19180:9180/tcp" # admin api port
      - "19080:9080/tcp" # http port
      - "19091:9091/tcp" # Prometheus port
      - "19443:9443/tcp" # https port
      - "19092:9092/tcp" # control port
      - "19100-19120:9100-9120/tcp" # tcp port
    depends_on:
      - etcd
    networks:
      apisix:
        ipv4_address: 172.64.0.3

  apisix-dashboard:
    container_name: apisix-dashboard
    image: apache/apisix-dashboard:latest
    restart: always
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - ./config/dashboard.yaml:/usr/local/apisix-dashboard/conf/conf.yaml:ro
      - ./config/schema.json:/usr/local/apisix-dashboard/conf/schema.json:ro
      - ./logs:/usr/local/apisix-dashboard/logs:rw
    ports:
      - "19000:9000/tcp" # Dashboard port
    depends_on:
      - etcd
      - apisix
    networks:
      apisix:
        ipv4_address: 172.64.0.4

# docker network create apisix_net --subnet 172.64.0.0/16 --gateway 172.64.0.1
networks:
  apisix:
    external:
      name: apisix_net

My config.yaml

apisix:
  node_listen: 9080
  enable_ipv6: false
  enable_control: true
  control:
    ip: "0.0.0.0"
    port: 9092
  stream_proxy:
    only: false
    tcp:
      - 9100
      - 9101
      - 9102
      - 9103
      - 9104
      - 9105
      - 9106
      - 9107
      - 9108
      - 9109
      - 9110
      - 9111
      - 9112
      - 9113
      - 9114
      - 9115
      - 9116
      - 9117
      - 9118
      - 9119
      - 9120
  # extra_lua_path: "/usr/local/apisix/apisix/ext_plugins/?.lua" 

deployment:
  role: traditional
  role_traditional:
    config_provider: etcd
  admin:
    allow_admin:
      - 0.0.0.0/0
    admin_key:
      - name: "admin"
        key: edd1c9f034335f136f87ad84b625c8f1
        role: admin
      - name: "viewer"
        key: 4054f7cf07e344346cd3f287985e76a2
        role: viewer

    enable_admin_cors: true
    admin_listen:
      ip: 0.0.0.0
      port: 9180

  etcd:
    host:
      - "http://etcd:2379"
    prefix: "/apisix"
    timeout: 30

plugins:
- server-info
- skywalking
- skywalking-logger

plugin_attr:

  prometheus:
    export_uri: /apisix/metrics
    export_addr:
      ip: "0.0.0.0"
      port: 9091
  skywalking:
    service_name: dev::APISIX
    service_instance_name: 10.10.85.14
    endpoint_addr: http://10.10.85.14:12800

discovery:
  nacos:
    host:
      - "http://nacos:nacos@10.10.85.14:8848"
    prefix: "/nacos/v1/"
    fetch_interval: 30
    weight: 100
    timeout:
      connect: 2000
      send: 2000
      read: 5000

Route information obtained from the admin API

{
    "total": 5,
    "list": [
        {
            "createdIndex": 3,
            "modifiedIndex": 1973,
            "key": "/apisix/routes/"
        },
        {
            "createdIndex": 1470,
            "value": {
                "id": "468393803724620526",
                "uri": "/device-transport/mqtt/*",
                "upstream": {
                    "pass_host": "pass",
                    "scheme": "http",
                    "nodes": [
                        {
                            "host": "10.10.19.179",
                            "port": 9988,
                            "weight": 1
                        }
                    ],
                    "timeout": {
                        "send": 6,
                        "read": 6,
                        "connect": 6
                    },
                    "type": "roundrobin",
                    "keepalive_pool": {
                        "requests": 1000,
                        "size": 320,
                        "idle_timeout": 60
                    }
                },
                "name": "device-transport",
                "status": 1,
                "create_time": 1688714062,
                "plugins": {
                    "ucsp-auth": {
                        "secret": "7b5cde0a41bf4588983e7e1685492b29",
                        "key": "1234567890abcdef",
                        "secret_map": {
                            "100080012": "56e608c295d4415b88938e290c507953",
                            "970010016": "7b5cde0a41bf4588983e7e1685492b29"
                        },
                        "redis": {
                            "idle_time": 20000,
                            "redis_host": "10.10.85.14",
                            "redis_password": "ubt83474428",
                            "redis_port": "19100",
                            "pool_size": 20,
                            "redis_ssl": "",
                            "redis_ssl_verify": "",
                            "redis_timeout": 1000,
                            "redis_database": "0"
                        },
                        "_meta": {
                            "disable": false
                        },
                        "iv": "1234567890123456"
                    }
                },
                "methods": [
                    "GET",
                    "POST",
                    "PUT",
                    "DELETE",
                    "PATCH",
                    "HEAD",
                    "OPTIONS",
                    "CONNECT",
                    "TRACE",
                    "PURGE"
                ],
                "update_time": 1689325213
            },
            "key": "/apisix/routes/468393803724620526",
            "modifiedIndex": 1918
        },
        {
            "createdIndex": 1659,
            "value": {
                "update_time": 1689304188,
                "id": "468965470297391854",
                "status": 0,
                "upstream": {
                    "pass_host": "pass",
                    "scheme": "http",
                    "nodes": [
                        {
                            "host": "10.10.85.14",
                            "port": 8848,
                            "weight": 1
                        }
                    ],
                    "timeout": {
                        "send": 6,
                        "read": 6,
                        "connect": 6
                    },
                    "type": "roundrobin",
                    "keepalive_pool": {
                        "requests": 1000,
                        "size": 320,
                        "idle_timeout": 60
                    }
                },
                "create_time": 1689054802,
                "uri": "/nacos/*",
                "methods": [
                    "GET",
                    "POST",
                    "PUT",
                    "DELETE",
                    "PATCH",
                    "HEAD",
                    "OPTIONS",
                    "CONNECT",
                    "TRACE",
                    "PURGE"
                ],
                "name": "nacos"
            },
            "key": "/apisix/routes/468965470297391854",
            "modifiedIndex": 1810
        },
        {
            "createdIndex": 1751,
            "value": {
                "id": "469259980801835758",
                "uri": "/usercenter/*",
                "name": "usercenter-local",
                "upstream_id": "469259855744467694",
                "status": 1,
                "host": "ucsp-dev.ubtrobot.com",
                "methods": [
                    "GET",
                    "POST",
                    "PUT",
                    "DELETE",
                    "PATCH",
                    "HEAD",
                    "OPTIONS",
                    "CONNECT",
                    "TRACE",
                    "PURGE"
                ],
                "create_time": 1689230344,
                "update_time": 1689325903
            },
            "key": "/apisix/routes/469259980801835758",
            "modifiedIndex": 1933
        },
        {
            "createdIndex": 1752,
            "value": {
                "update_time": 1689325908,
                "id": "469260008534573806",
                "status": 1,
                "uri": "/usercenter/*",
                "create_time": 1689230361,
                "methods": [
                    "GET",
                    "POST",
                    "PUT",
                    "DELETE",
                    "PATCH",
                    "HEAD",
                    "OPTIONS",
                    "CONNECT",
                    "TRACE",
                    "PURGE"
                ],
                "upstream_id": "469259940989502190",
                "name": "usercenter"
            },
            "key": "/apisix/routes/469260008534573806",
            "modifiedIndex": 1934
        }
    ]
}
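
For reference, a listing like the one above can be fetched from the Admin API with something like the following sketch (host port 19180 and the admin key come from the docker-compose and config.yaml shown earlier; not necessarily the exact command used):

```sh
# List all routes via the Admin API (host port 19180 maps to 9180 in the container).
curl "http://127.0.0.1:19180/apisix/admin/routes" \
  -H "X-API-KEY: edd1c9f034335f136f87ad84b625c8f1"
```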

Access Results

[dev@bigdata-304 apisix]$ curl http://10.10.85.14:19080/device-transport/mqtt/v1/test/gateway
{"error_msg":"404 Route Not Found"}

apisix access.log

10.10.85.14 - - [17/Jul/2023:11:06:35 +0800] 10.10.85.14:19080 "GET /device-transport/mqtt/v1/test/gateway HTTP/1.1" 404 47 0.000 "-" "curl/7.29.0" - - - "http://10.10.85.14:19080"

The error.log has no records for this request.

ShuLian1984 commented 1 year ago

docker info

Client:
 Debug Mode: false

Server:
 Containers: 23
  Running: 23
  Paused: 0
  Stopped: 0
 Images: 60
 Server Version: 19.03.8
 Storage Driver: overlay2
  Backing Filesystem: <unknown>
  Supports d_type: true
  Native Overlay Diff: true
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: nvidia runc
 Default Runtime: nvidia
 Init Binary: docker-init
 containerd version: 7ad184331fa3e55e52b890ea95e65ba581ae3429
 runc version: dc9208a3303feef5b3839f4323d9beb36df0a9dd
 init version: fec3683
 Security Options:
  seccomp
   Profile: default
 Kernel Version: 3.10.0-1062.18.1.el7.x86_64
 Operating System: CentOS Linux 7 (Core)
 OSType: linux
 Architecture: x86_64
 CPUs: 40
 Total Memory: 251.4GiB
 Name: bigdata-304
 ID: HPEO:YAQK:FDGR:QGRR:FGUL:CXRO:WSOC:QRDG:UMB2:B3CU:5UCX:M5LZ
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Registry: https://index.docker.io/v1/
 Labels:
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false

docker-compose version

docker-compose version 1.21.2, build a133471
Revolyssup commented 1 year ago

@ShuLian1984 Is there any error log like `failed to create etcd instance for fetching /routes` when your APISIX restarts? Or any other log at the start of the restart? Can you check whether there are active connections to etcd, specifically on `/apisix/routes`, when the container restarts? And just to confirm: when you reapply the routes, it works again, right?

ShuLian1984 commented 1 year ago

Steps to reproduce:

1. Start with a normally running APISIX container.
2. Restart it.
3. The route rules become invalid.

There is no `failed to create etcd instance for fetching` error in the logs.
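
A sketch of the kind of search used to check this in the mounted logs (the `./logs` bind mount comes from the docker-compose file above):

```sh
# Look for etcd-related warnings/errors around the restart in the mounted log directory.
grep -iE "etcd|failed" ./logs/error.log | tail -n 50
```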

error.log

2023/07/17 14:22:10 [warn] 82#82: *8 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 49#49: *5 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 51#51: *2 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 48#48: *4 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 56#56: *10 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 68#68: *6 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 57#57: *11 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 82#82: *8 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 116#116: *19 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 75#75: *20 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 49#49: *5 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 59#59: *9 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 51#51: *2 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 54#54: *18 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 56#56: *10 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 68#68: *6 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 48#48: *4 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 70#70: *12 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 145#145: *27 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 100#100: *22 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 87#87: *16 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 105#105: *23 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 189#189: *25 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 57#57: *11 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 75#75: *20 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 116#116: *19 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 137#137: *32 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 179#179: *21 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 257#257: *36 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 59#59: *9 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 55#55: *14 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 70#70: *12 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 54#54: *18 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 105#105: *23 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 145#145: *27 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 100#100: *22 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 189#189: *25 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 250#250: *33 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 87#87: *16 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 58#58: *7 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 137#137: *32 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 201#201: *28 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 131#131: *17 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 179#179: *21 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 164#164: *41 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 158#158: *37 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 50#50: *15 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 257#257: *36 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 55#55: *14 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 243#243: *40 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 250#250: *33 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 58#58: *7 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 164#164: *41 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 131#131: *17 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 158#158: *37 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 50#50: *15 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 192#192: *29 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 201#201: *28 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 215#215: *30 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 243#243: *40 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 52#52: *1 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 222#222: *35 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 192#192: *29 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 126#126: *38 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 215#215: *30 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 53#53: *13 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 62#62: *26 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 52#52: *1 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 222#222: *35 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 62#62: *26 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 205#205: *24 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 73#73: *3 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 126#126: *38 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 193#193: *31 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 73#73: *3 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 53#53: *13 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 205#205: *24 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 237#237: *34 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 49#49: *42 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 193#193: *31 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 82#82: *43 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 230#230: *39 [lua] plugin.lua:202: load(): new plugins: {"skywalking":true,"skywalking-logger":true,"ucsp-auth":true,"server-info":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 51#51: *44 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 237#237: *34 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 48#48: *47 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 68#68: *46 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 57#57: *49 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 75#75: *48 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 116#116: *50 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 230#230: *39 [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"limit-conn":true,"ip-restriction":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 105#105: *52 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 100#100: *55 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 189#189: *56 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 145#145: *57 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 54#54: *54 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 70#70: *53 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 137#137: *58 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 257#257: *62 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 250#250: *63 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 87#87: *59 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 164#164: *64 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 58#58: *65 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 56#56: *45 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 201#201: *69 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 158#158: *67 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 59#59: *51 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 222#222: *74 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 55#55: *61 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 179#179: *60 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 131#131: *70 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 126#126: *75 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 243#243: *68 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 73#73: *76 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 52#52: *72 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 215#215: *73 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 192#192: *71 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 50#50: *66 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 205#205: *78 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 53#53: *77 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 230#230: *202 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 62#62: *446 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 237#237: *124 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023/07/17 14:22:10 [warn] 193#193: *827 stream [lua] plugin.lua:252: load_stream(): new plugins: {"mqtt-proxy":true,"syslog":true,"ip-restriction":true,"limit-conn":true}, context: init_worker_by_lua*
2023-07-17T14:22:39.641+0800    WARN    store/store.go:128      data not found by key: 469259980801835758
2023-07-17T14:22:39.765+0800    WARN    store/store.go:128      data not found by key: 468393803724620526
2023-07-17T14:22:39.765+0800    WARN    store/store.go:128      data not found by key: 468965470297391854
2023-07-17T14:22:39.765+0800    WARN    store/store.go:128      data not found by key: 469259980801835758
2023-07-17T14:22:39.765+0800    WARN    store/store.go:128      data not found by key: 469260008534573806

469259980801835758, for example, is one of my route IDs.

shreemaan-abhishek commented 1 year ago
volumes:
 - "./data:/bitnami/etcd/data"

@ShuLian1984 are you sure that the ./data directory remains unchanged when you restart the container?

And could you please share how you restart the container?

ShuLian1984 commented 1 year ago

Nothing special, I only run `docker restart apisix`.

The route data obtained from the admin API is correct and not lost. After I reapply the routes, they work again.
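
For reference, "reapplying" a route here just means PUT-ting the same definition back under its ID via the Admin API. A sketch using one of the route IDs above (body abbreviated; in practice the full `value` object from the listing would be used):

```sh
# Re-create route 468393803724620526 with the same definition (abbreviated body).
curl -X PUT "http://127.0.0.1:19180/apisix/admin/routes/468393803724620526" \
  -H "X-API-KEY: edd1c9f034335f136f87ad84b625c8f1" \
  -H "Content-Type: application/json" \
  -d '{
    "uri": "/device-transport/mqtt/*",
    "name": "device-transport",
    "status": 1,
    "upstream": {
      "type": "roundrobin",
      "nodes": [{ "host": "10.10.19.179", "port": 9988, "weight": 1 }]
    }
  }'
```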

This is the OS log (`/var/log/messages`) from when I restart the APISIX container:

Jul 19 11:16:58 bigdata-304 systemd-logind: Removed session 168974.
Jul 19 11:16:59 bigdata-304 dockerd: time="2023-07-19T11:16:59.319585250+08:00" level=info msg="Container d55b9d0ad5d0af1deec43fa219411a710086486f764554844e214983878acbb4 failed to exit within 10 seconds of signal 3 - using the force"
Jul 19 11:16:59 bigdata-304 containerd: time="2023-07-19T11:16:59.533493009+08:00" level=info msg="shim reaped" id=d55b9d0ad5d0af1deec43fa219411a710086486f764554844e214983878acbb4
Jul 19 11:16:59 bigdata-304 dockerd: time="2023-07-19T11:16:59.541915248+08:00" level=info msg="ignoring event" module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(vethd684d8c) entered disabled state
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(vethd684d8c) entered disabled state
Jul 19 11:17:00 bigdata-304 kernel: device vethd684d8c left promiscuous mode
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(vethd684d8c) entered disabled state
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.210+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.215+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.219+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.222+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.224+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.227+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.231+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 libvirtd: 2023-07-19 03:17:00.234+0000: 4146: error : virNetDevSendEthtoolIoctl:3078 : ethtool ioctl error: No such device
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered blocking state
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered disabled state
Jul 19 11:17:00 bigdata-304 kernel: device veth6beb84b entered promiscuous mode
Jul 19 11:17:00 bigdata-304 kernel: IPv6: ADDRCONF(NETDEV_UP): veth6beb84b: link is not ready
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered blocking state
Jul 19 11:17:00 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered forwarding state
Jul 19 11:17:00 bigdata-304 containerd: time="2023-07-19T11:17:00.910653678+08:00" level=info msg="shim containerd-shim started" address="/containerd-shim/moby/d55b9d0ad5d0af1deec43fa219411a710086486f764554844e214983878acbb4/shim.sock" debug=false pid=224533
Jul 19 11:17:01 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered disabled state
Jul 19 11:17:01 bigdata-304 kernel: IPv6: ADDRCONF(NETDEV_UP): eth0: link is not ready
Jul 19 11:17:01 bigdata-304 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
Jul 19 11:17:01 bigdata-304 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): veth6beb84b: link becomes ready
Jul 19 11:17:01 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered blocking state
Jul 19 11:17:01 bigdata-304 kernel: br-ca6f814b9dff: port 2(veth6beb84b) entered forwarding state
Jul 19 11:17:01 bigdata-304 systemd: Started Session 169036 of user root.
Sn0rt commented 1 year ago

I'll take a look.

Revolyssup commented 1 year ago

@ShuLian1984 Can you confirm that, when you are not able to access the route, the routes are still present in your etcd? You can use `etcdctl` to GET the keys and check.
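
A sketch of such a check, assuming the etcd container and the default `/apisix` prefix from the compose and config files above:

```sh
# List the route keys directly from etcd (etcdctl v3 ships in the bitnami/etcd image).
docker exec etcd etcdctl --endpoints=http://127.0.0.1:2379 \
  get /apisix/routes --prefix --keys-only
```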

Revolyssup commented 1 year ago

@ShuLian1984 When you are not able to access the route, can you also check the control API at `/v1/routes` to see whether your route is present?
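
With the control API settings from the config.yaml above (0.0.0.0:9092 in the container, mapped to 19092 on the host), that check would look something like:

```sh
# Routes as seen by the running APISIX workers (control API), independent of etcd.
curl "http://127.0.0.1:19092/v1/routes"
```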

Revolyssup commented 1 year ago

@ShuLian1984 I also see that your image tag doesn't have a distro suffix like centos or debian. Is the image locally built, or have you retagged the original image without the suffix? What's the distro of the image?

shreemaan-abhishek commented 1 year ago

I could not reproduce this issue using the docker-compose example provided here:

https://github.com/shreemaan-abhishek/apisix-docker/blob/90197afe9886d99f314a8e9753500bfcd91298c5/example/docker-compose-arm64.yml#L31-L45

ShuLian1984 commented 1 year ago

@ShuLian1984 Can you confirm that when you are not able to access the route, the routes are present in your etcd? You can use etcdctl to GET the keys to figure out.

I can read the data through an etcd manager tool or the admin API.

The etcd image came from bitnami/etcd:latest; I just retagged it with the specific version, but I can't find that tag anymore. The current bitnami/etcd:latest is 3.5.9-5, not 3.5.9-2.

IMAGE IMAGE ID
apache/apisix:3.3.0 f63abe934218
apache/apisix-dashboard:latest 2f0aed4002d7
bitnami/etcd:3.5.9-2 43329993da1d

docker history

IMAGE               CREATED             CREATED BY                                      SIZE                COMMENT
43329993da1d        292 years ago       crane flatten sha256:502b3e350d39adaafe7dbd3…   144MB               [{"created":"2023-05-12T00:10:11.513452739Z","comment":"from Bitnami with love"},{"created":"2023-05-22T10:23:43.221972322Z","created_by":"ARG TARGETARCH","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:43.221972322Z","created_by":"LABEL org.opencontainers.image.base.name=docker.io/bitnami/minideb:bullseye org.opencontainers.image.created=2023-05-22T10:22:39Z org.opencontainers.image.description=Application packaged by VMware, Inc org.opencontainers.image.licenses=Apache-2.0 org.opencontainers.image.ref.name=3.5.9-debian-11-r5 org.opencontainers.image.title=etcd org.opencontainers.image.vendor=VMware, Inc. org.opencontainers.image.version=3.5.9","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:43.221972322Z","created_by":"ENV HOME=/ OS_ARCH=amd64 OS_FLAVOUR=debian-11 OS_NAME=linux","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:43.221972322Z","created_by":"COPY prebuildfs / # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:43.221972322Z","created_by":"SHELL [/bin/bash -o pipefail -c]","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:47.195163054Z","created_by":"RUN |1 TARGETARCH=amd64 /bin/bash -o pipefail -c install_packages ca-certificates curl procps # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:48.271268991Z","created_by":"RUN |1 TARGETARCH=amd64 /bin/bash -o pipefail -c mkdir -p /tmp/bitnami/pkg/cache/ \u0026\u0026 cd /tmp/bitnami/pkg/cache/ \u0026\u0026     COMPONENTS=(       \"yq-4.33.3-1-linux-${OS_ARCH}-debian-11\"       \"etcd-3.5.9-2-linux-${OS_ARCH}-debian-11\"     ) \u0026\u0026     for COMPONENT in \"${COMPONENTS[@]}\"; do       if [ ! 
-f \"${COMPONENT}.tar.gz\" ]; then         curl -SsLf \"https://downloads.bitnami.com/files/stacksmith/${COMPONENT}.tar.gz\" -O ;         curl -SsLf \"https://downloads.bitnami.com/files/stacksmith/${COMPONENT}.tar.gz.sha256\" -O ;       fi \u0026\u0026       sha256sum -c \"${COMPONENT}.tar.gz.sha256\" \u0026\u0026       tar -zxf \"${COMPONENT}.tar.gz\" -C /opt/bitnami --strip-components=2 --no-same-owner --wildcards '*/files' \u0026\u0026       rm -rf \"${COMPONENT}\".tar.gz{,.sha256} ;     done # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:50.864312287Z","created_by":"RUN |1 TARGETARCH=amd64 /bin/bash -o pipefail -c apt-get autoremove --purge -y curl ca-certificates \u0026\u0026     apt-get update \u0026\u0026 apt-get upgrade -y \u0026\u0026     apt-get clean \u0026\u0026 rm -rf /var/lib/apt/lists /var/cache/apt/archives # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:50.948231767Z","created_by":"RUN |1 TARGETARCH=amd64 /bin/bash -o pipefail -c chmod g+rwX /opt/bitnami # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:50.982097118Z","created_by":"COPY rootfs / # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:51.077432534Z","created_by":"RUN |1 TARGETARCH=amd64 /bin/bash -o pipefail -c /opt/bitnami/scripts/etcd/postunpack.sh # buildkit","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:51.077432534Z","created_by":"ENV APP_VERSION=3.5.9 BITNAMI_APP_NAME=etcd ETCDCTL_API=3 PATH=/opt/bitnami/common/bin:/opt/bitnami/etcd/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:51.077432534Z","created_by":"EXPOSE map[2379/tcp:{} 2380/tcp:{}]","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:51.112488339Z","created_by":"WORKDIR /opt/bitnami/etcd","comment":"buildkit.dockerfile.v0"},{"created":"2023-05-22T10:23:51.112488339Z","created_by":"USER 1001","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:51.112488339Z","created_by":"ENTRYPOINT [\"/opt/bitnami/scripts/etcd/entrypoint.sh\"]","comment":"buildkit.dockerfile.v0","empty_layer":true},{"created":"2023-05-22T10:23:51.112488339Z","created_by":"CMD [\"/opt/bitnami/scripts/etcd/run.sh\"]","comment":"buildkit.dockerfile.v0","empty_layer":true}]
ShuLian1984 commented 1 year ago

How do I fix it?

shreemaan-abhishek commented 1 year ago

@ShuLian1984, we are unable to reproduce this issue.

ShuLian1984 commented 1 year ago

Thank you everyone.

This issue cannot be reproduced on other servers. Maybe it's because my environment is too complex; I've given up on it.

AlinsRan commented 1 year ago

OK, I will close this issue. If there is any progress, you can reopen it.

jujiale commented 5 months ago

@ShuLian1984 Is there any error log like failed to create etcd instance for fetching /routes when your APISIX restarts? Or any other log at the start of the restart? Can you check if there are active connections to etcd, specifically on the /apisix/routes when the container restarts? And just to confirm, when you reapply the routes, it works again, right?

@ShuLian1984 Hello, we ran into a situation like the one you mentioned above. When we modify a route config with apisix-dashboard, the config stored in etcd matches what we modified (and its update_time is correct), but when I call `/v1/route/{route id}` on the control API, it returns the old config, and its update_time is far in the past even though I modified the route recently. The error log looks like this:

  2024/07/04 16:00:55 [error] 16235#16235: *143280176 [lua] config_util.lua:86: failed to find clean_handler with idx 1, client: 172.24.61.47, server: _, request: "POST /xxxxx/epl HTTP/1.1", host: "xxxxx"
2024/07/04 16:00:55 [error] 16240#16240: *143284010 [lua] config_etcd.lua:584: failed to fetch data from etcd: /usr/local/apisix/core/config_util.lua:104: attempt to index local 'item' (a boolean value)
stack traceback:
    /usr/local/apisix/core/config_util.lua:104: in function 'fire_all_clean_handlers'
    /usr/local/apisix/core/config_etcd.lua:315: in function 'sync_data'
    /usr/local/apisix/core/config_etcd.lua:541: in function </usr/local/apisix/core/config_etcd.lua:532>
    [C]: in function 'xpcall'
    /usr/local/apisix/core/config_etcd.lua:532: in function </usr/local/apisix/core/config_etcd.lua:513>,  etcd key: /test1/apisix/routes, context: ngx.timer

I also found, from captures of the `/v3/watch` traffic between APISIX and etcd, that many requests take longer than 30s and trigger a timeout.
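
For reference, the mismatch described above can be narrowed down by comparing the two views directly. A sketch (the `/test1/apisix` prefix is taken from the error log; `<etcd_host>` and `<route_id>` are placeholders, and the control API address depends on your config, 127.0.0.1:9090 being the default):

```sh
# What etcd stores for the route:
etcdctl --endpoints=http://<etcd_host>:2379 get /test1/apisix/routes/<route_id>
# What the running APISIX worker holds in memory (control API):
curl "http://127.0.0.1:9090/v1/route/<route_id>"
```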

I want to know why such a situation could happen and whether the community has resolved it. Thanks.

We are using APISIX 2.15.0 and etcd 3.5.0.