Kong / kong

🦍 The Cloud-Native API Gateway and AI Gateway.
https://konghq.com/install/#kong-community
Apache License 2.0

[2.1.4][DBLess]Can't load correct ssl cert during reload #6464

Closed debu99 closed 3 years ago

debu99 commented 3 years ago

Our SSL certificate is in the YAML configuration file, but after a reload, Kong serves the default cert:

2020-10-13 13:32:29.840 14786-14990/com.xxx.selfcare.debug D/OkHttp:     certificate: sha256/OG40oLN/ik+YfJsq/Upj6l7nFyh+NMjKaFe42k2JX4k=
2020-10-13 13:32:29.840 14786-14990/com.xxx.selfcare.debug D/OkHttp:     DN: CN=localhost,OU=IT Department,O=Kong,L=San Francisco,ST=California,C=US
2020/10/13 01:59:18 [notice] 398#0: signal 1 (SIGHUP) received from 23977, reconfiguring
2020/10/13 01:59:18 [notice] 398#0: reconfiguring
2020/10/13 01:59:18 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 01:59:18 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 01:59:19 [notice] 398#0: using the "epoll" event method
2020/10/13 01:59:19 [notice] 398#0: start worker processes
2020/10/13 01:59:19 [notice] 398#0: start worker process 23987
2020/10/13 01:59:19 [notice] 23987#0: *8865856 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 01:59:19 [notice] 23987#0: *8865856 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 01:59:19 [notice] 23987#0: *8865856 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 01:59:19 [notice] 398#0: signal 17 (SIGCHLD) received from 23979
2020/10/13 01:59:19 [notice] 20002#0: gracefully shutting down
2020/10/13 01:59:19 [notice] 20002#0: exiting
2020/10/13 01:59:19 [notice] 20002#0: exit
2020/10/13 01:59:19 [notice] 398#0: signal 17 (SIGCHLD) received from 20002
2020/10/13 01:59:19 [notice] 398#0: worker process 20002 exited with code 0
2020/10/13 01:59:19 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 02:00:01 [notice] 398#0: signal 1 (SIGHUP) received from 24019, reconfiguring
2020/10/13 02:00:01 [notice] 398#0: reconfiguring
2020/10/13 02:00:01 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 02:00:01 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 02:00:03 [notice] 398#0: using the "epoll" event method
2020/10/13 02:00:03 [notice] 398#0: start worker processes
2020/10/13 02:00:03 [notice] 398#0: start worker process 24029
2020/10/13 02:00:03 [notice] 24029#0: *8866096 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:00:03 [notice] 24029#0: *8866096 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:00:04 [notice] 398#0: signal 17 (SIGCHLD) received from 24020
2020/10/13 02:00:04 [notice] 23987#0: gracefully shutting down
2020/10/13 02:00:04 [notice] 23987#0: exiting
2020/10/13 02:00:03 [notice] 24029#0: *8866096 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 02:00:04 [notice] 23987#0: exit
2020/10/13 02:00:04 [notice] 398#0: signal 17 (SIGCHLD) received from 23987
2020/10/13 02:00:04 [notice] 398#0: worker process 23987 exited with code 0
2020/10/13 02:00:04 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 02:05:56 [notice] 398#0: signal 1 (SIGHUP) received from 24095, reconfiguring
2020/10/13 02:05:56 [notice] 398#0: reconfiguring
2020/10/13 02:05:56 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 02:05:56 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 02:05:56 [notice] 398#0: using the "epoll" event method
2020/10/13 02:05:56 [notice] 398#0: start worker processes
2020/10/13 02:05:56 [notice] 398#0: start worker process 24105
2020/10/13 02:05:56 [notice] 24105#0: *8866951 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:05:56 [notice] 24105#0: *8866951 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:05:56 [notice] 24105#0: *8866951 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 02:05:56 [notice] 398#0: signal 17 (SIGCHLD) received from 24097
2020/10/13 02:05:56 [notice] 24029#0: gracefully shutting down
2020/10/13 02:05:56 [notice] 24029#0: exiting
2020/10/13 02:05:56 [notice] 24029#0: exit
2020/10/13 02:05:56 [notice] 398#0: signal 17 (SIGCHLD) received from 24029
2020/10/13 02:05:56 [notice] 398#0: worker process 24029 exited with code 0
2020/10/13 02:05:56 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 02:07:25 [notice] 398#0: signal 1 (SIGHUP) received from 24140, reconfiguring
2020/10/13 02:07:25 [notice] 398#0: reconfiguring
2020/10/13 02:07:25 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 02:07:25 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 02:07:25 [notice] 398#0: using the "epoll" event method
2020/10/13 02:07:25 [notice] 398#0: start worker processes
2020/10/13 02:07:25 [notice] 398#0: start worker process 24150
2020/10/13 02:07:25 [notice] 24150#0: *8867260 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:07:25 [notice] 24150#0: *8867260 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 02:07:25 [notice] 24150#0: *8867260 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 02:07:25 [notice] 398#0: signal 17 (SIGCHLD) received from 24142
2020/10/13 02:07:25 [notice] 24105#0: gracefully shutting down
2020/10/13 02:07:25 [notice] 24105#0: exiting
2020/10/13 02:07:25 [notice] 24105#0: exit
2020/10/13 02:07:25 [notice] 398#0: signal 17 (SIGCHLD) received from 24105
2020/10/13 02:07:25 [notice] 398#0: worker process 24105 exited with code 0
2020/10/13 02:07:25 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 02:49:38 [error] 24150#0: *8873104 [lua] certificate.lua:38: log(): [ssl] failed to fetch SNI: failed to fetch '54.251.150.140:6443' SNI: [off] must not be an IP, context: ssl_certificate_by_lua*, client: 20.17.97.16, server: 0.0.0.0:6443
2020/10/13 02:49:38 [error] 24150#0: *8873104 [lua] certificate.lua:38: log(): [ssl] failed to fetch SNI: failed to fetch '*.251.150.140:6443' SNI: [off] must not have a port, context: ssl_certificate_by_lua*, client: 20.17.97.16, server: 0.0.0.0:6443
2020/10/13 04:15:49 [error] 24150#0: *8884859 [lua] certificate.lua:38: log(): [ssl] failed to fetch SNI: failed to fetch '54.251.150.140' SNI: [off] must not be an IP, context: ssl_certificate_by_lua*, client: 6.217.2.12, server: 0.0.0.0:443
2020/10/13 04:32:22 [error] 24150#0: *8887325 [lua] certificate.lua:38: log(): [ssl] failed to fetch SNI: failed to fetch '54.251.150.140:443' SNI: [off] must not be an IP, context: ssl_certificate_by_lua*, client: 20.17.97.42, server: 0.0.0.0:443
2020/10/13 04:32:22 [error] 24150#0: *8887325 [lua] certificate.lua:38: log(): [ssl] failed to fetch SNI: failed to fetch '*.251.150.140:443' SNI: [off] must not have a port, context: ssl_certificate_by_lua*, client: 20.17.97.42, server: 0.0.0.0:443
2020/10/13 05:02:08 [alert] 24150#0: *8891220 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v2/sg/en/mobile/notification/history?push=true HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:02:08 [alert] 24150#0: *8891221 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /api/v1/quilt/page/non-telco-sistic?uuid=ACA43233-10AB-435F-901C-8275F50262A8 HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:02:08 [alert] 24150#0: *8891224 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v4/sg/en/mobile/promotions/dashboard/?uuid=179ECDD2-E6D6-4D76-ACF2-04DF2D5DB80E HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:02:08 [alert] 24150#0: *8891225 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v4/sg/en/mobile/promotions/dashboard/?uuid=179ECDD2-E6D6-4D76-ACF2-04DF2D5DB80E HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:02:31 [alert] 24150#0: *8891287 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v2/sg/en/mobile/notification/history?push=true HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:29:23 [notice] 398#0: signal 1 (SIGHUP) received from 25841, reconfiguring
2020/10/13 05:29:23 [notice] 398#0: reconfiguring
2020/10/13 05:29:23 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 05:29:23 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 05:29:24 [notice] 398#0: using the "epoll" event method
2020/10/13 05:29:24 [notice] 398#0: start worker processes
2020/10/13 05:29:24 [notice] 398#0: start worker process 25855
2020/10/13 05:29:24 [notice] 25855#0: *8894872 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:29:24 [notice] 25855#0: *8894872 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:29:24 [notice] 25855#0: *8894872 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 05:29:24 [notice] 398#0: signal 17 (SIGCHLD) received from 25846
2020/10/13 05:29:24 [notice] 24150#0: gracefully shutting down
2020/10/13 05:29:27 [notice] 398#0: signal 1 (SIGHUP) received from 26145, reconfiguring
2020/10/13 05:29:27 [notice] 398#0: reconfiguring
2020/10/13 05:29:27 [warn] 398#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/kong/nginx.conf:6
2020/10/13 05:29:27 [warn] 398#0: load balancing method redefined in /usr/local/kong/nginx-kong.conf:63
2020/10/13 05:29:28 [notice] 398#0: using the "epoll" event method
2020/10/13 05:29:28 [notice] 398#0: start worker processes
2020/10/13 05:29:28 [notice] 398#0: start worker process 26155
2020/10/13 05:29:28 [notice] 26155#0: *8894929 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:29:28 [notice] 26155#0: *8894929 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:29:28 [notice] 26155#0: *8894929 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
2020/10/13 05:29:28 [notice] 398#0: signal 17 (SIGCHLD) received from 26147
2020/10/13 05:29:28 [notice] 25855#0: gracefully shutting down
2020/10/13 05:29:28 [notice] 25855#0: exiting
2020/10/13 05:29:28 [notice] 25855#0: exit
2020/10/13 05:29:28 [notice] 398#0: signal 17 (SIGCHLD) received from 25855
2020/10/13 05:29:28 [notice] 398#0: worker process 25855 exited with code 0
2020/10/13 05:29:28 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 05:29:38 [notice] 24150#0: exiting
2020/10/13 05:29:38 [notice] 24150#0: exit
2020/10/13 05:29:38 [notice] 398#0: signal 17 (SIGCHLD) received from 24150
2020/10/13 05:29:38 [notice] 398#0: worker process 24150 exited with code 0
2020/10/13 05:29:38 [notice] 398#0: signal 29 (SIGIO) received
2020/10/13 05:30:26 [alert] 26155#0: *8895140 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /api/v1/quilt/page/non-telco-sistic?uuid=0ABCFFAA-75B2-4EB4-BDA9-0FBA9E73144F HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:30:26 [alert] 26155#0: *8895142 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v2/sg/en/mobile/notification/history?push=true HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:30:26 [alert] 26155#0: *8895144 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v4/sg/en/mobile/promotions/dashboard/?uuid=9E27A933-E5A3-473C-84F0-F905C6F2CC11 HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:30:26 [alert] 26155#0: *8895145 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: 169.232.190.9, server: kong, request: "GET /v4/sg/en/mobile/promotions/dashboard/?uuid=9E27A933-E5A3-473C-84F0-F905C6F2CC11 HTTP/1.1", host: "hidden.test.com:6443"
2020/10/13 05:45:59 [notice] 398#0: signal 15 (SIGTERM) received from 26824, exiting
2020/10/13 05:45:59 [notice] 26155#0: exiting
2020/10/13 05:45:59 [notice] 398#0: signal 15 (SIGTERM) received from 1, exiting
2020/10/13 05:45:59 [notice] 26155#0: signal 15 (SIGTERM) received from 1, exiting
2020/10/13 05:45:59 [notice] 26155#0: exit
2020/10/13 05:45:59 [notice] 398#0: signal 17 (SIGCHLD) received from 26155
2020/10/13 05:45:59 [notice] 398#0: worker process 26155 exited with code 0
2020/10/13 05:45:59 [notice] 398#0: exit
2020/10/13 05:45:59 [notice] 26860#0: using the "epoll" event method
2020/10/13 05:45:59 [notice] 26860#0: openresty/1.15.8.3
2020/10/13 05:45:59 [notice] 26860#0: built by gcc 7.5.0 (Ubuntu 7.5.0-3ubuntu1~18.04)
2020/10/13 05:45:59 [notice] 26860#0: OS: Linux 5.3.0-1019-aws
2020/10/13 05:45:59 [notice] 26860#0: getrlimit(RLIMIT_NOFILE): 65536:65536
2020/10/13 05:45:59 [notice] 26869#0: start worker processes
2020/10/13 05:45:59 [notice] 26869#0: start worker process 26870
2020/10/13 05:45:59 [notice] 26870#0: *1 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:45:59 [notice] 26870#0: *1 [lua] cache.lua:374: purge(): [DB cache] purging (local) cache, context: init_worker_by_lua*
2020/10/13 05:45:59 [notice] 26870#0: *1 [kong] init.lua:329 declarative config loaded from /etc/xxx/api-gateway/api-gateway.yml, context: init_worker_by_lua*
bungle commented 3 years ago

@debu99,

Can you show us how you tested it? Also, can you show us how you started Kong and how you reloaded it (the exact commands)? Finally, can you show us a small example of the YAML (you can remove all the private parts)?

I have tried to reproduce this, but have not been able to yet.

debu99 commented 3 years ago

No, we did not test it deliberately; we changed some configuration and then reloaded Kong. We run Kong as a systemd service and use systemctl reload kong to reload it:

[Unit]
Description=kong service
Documentation=https://docs.konghq.com
After=syslog.target network.target

[Service]
User=circlesuser
Group=circlesuser
Type=forking
LimitAS=infinity
LimitRSS=infinity
LimitCORE=infinity
LimitNOFILE=65536
ExecStart=/usr/local/bin/kong start -v --conf /etc/our_config/api-gateway/api-gateway.conf
ExecReload=/usr/local/bin/kong reload -v --conf /etc/our_config/api-gateway/api-gateway.conf
ExecStop=/usr/local/bin/kong stop

[Install]
WantedBy=multi-user.target
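
For context, a reload through this unit would typically be triggered as follows (a minimal sketch, assuming the unit above is installed as /etc/systemd/system/kong.service):

sudo systemctl daemon-reload
sudo systemctl enable --now kong
# Runs ExecReload, i.e. "kong reload", which sends SIGHUP to the nginx master
# (matching the "signal 1 (SIGHUP) received" lines in the log above)
sudo systemctl reload kong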

bungle commented 3 years ago

@debu99, good news! I could reproduce this. I'll start looking at it.

bungle commented 3 years ago

Here is how I reproduced it:

  1. dbless.conf file:
database=off
declarative_config=dbless.yml
  2. dbless.yml file:
_transform: false
_format_version: '2.1'
certificates:
- created_at: 1604424735
  cert: |
    -----BEGIN CERTIFICATE-----
    MIIDzTCCArWgAwIBAgIUMmq4W4is+P02LXKinUdLoPjFuDYwDQYJKoZIhvcNAQEL
    BQAwdjELMAkGA1UEBhMCVVMxEzARBgNVBAgMCkNhbGlmb3JuaWExFjAUBgNVBAcM
    DVNhbiBGcmFuY2lzY28xIDAeBgNVBAoMF0tvbmcgQ2x1c3RlcmluZyBUZXN0aW5n
    MRgwFgYDVQQDDA9rb25nX2NsdXN0ZXJpbmcwHhcNMTkxMTEzMDU0NTA1WhcNMjkx
    MTEwMDU0NTA1WjB2MQswCQYDVQQGEwJVUzETMBEGA1UECAwKQ2FsaWZvcm5pYTEW
    MBQGA1UEBwwNU2FuIEZyYW5jaXNjbzEgMB4GA1UECgwXS29uZyBDbHVzdGVyaW5n
    IFRlc3RpbmcxGDAWBgNVBAMMD2tvbmdfY2x1c3RlcmluZzCCASIwDQYJKoZIhvcN
    AQEBBQADggEPADCCAQoCggEBALr7evXK3nLxW98lXDWUcyNRCKDzUVX5Rlm7ny0a
    qVIh+qRUT7XGHFnDznl7s1gEkcxLtuMnKBV7Ic2jVTzKluZZFJD5H2plP7flpVu/
    byvpBNguERFDC2mbnlX7TSRhhWjlYTgFS2KiFP1OjYjim6vemszobDsCg2gRs0Mh
    A7XwsVvPSFNfnAOPTpyLRGtN3ShEA0LKjBkjg2u67MPAfg1y8/8Tm3h/kqfOciMT
    5ax2J1Ll/9/oCWX9qW6gNmnnUGNlBpcAZk3pzh6n1coRnVaysoCPYPgd9u1KoBkt
    uTQJOn1Qi3OWPZzyiLGRa/X0tGx/5QQDnLr6GyDjwPcC09sCAwEAAaNTMFEwHQYD
    VR0OBBYEFNNvhlhHAsJtBZejHystlPa/CoP2MB8GA1UdIwQYMBaAFNNvhlhHAsJt
    BZejHystlPa/CoP2MA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEB
    AHQpVBYGfFPFTRY/HvtWdXROgW358m9rUC5E4SfTJ8JLWpCB4J+hfjQ+mASTFb1V
    5FS8in8S/u1MgeU65RC1/xt6Rof7Tu/Cx2SusPWo0YGyN0E9mwr2c91JsIgDO03Y
    gtDiavyw3tAPVo5n2U3y5Hf46bfT5TLZ2yFnUJcKRZ0CeX6YAJA5dwG182xOn02r
    kkh9T1bO72pQHi15QxnQ9Gc4Mi5gjuxX4/Xyag5KyEXnniTb7XquW+JKP36RfhnU
    DGoEEUNU5UYwIzh910NM0UZubu5Umya1JVumoDqAi1lf2DHhKwDNAhmozYqE1vJJ
    +e1C9/9oqok3CRyLDe+VJ7M=
    -----END CERTIFICATE-----
  id: 3c43a3a1-c47a-4806-b8f3-fb03b55c7cfa
  tags: ~
  key: |
    -----BEGIN PRIVATE KEY-----
    MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQC6+3r1yt5y8Vvf
    JVw1lHMjUQig81FV+UZZu58tGqlSIfqkVE+1xhxZw855e7NYBJHMS7bjJygVeyHN
    o1U8ypbmWRSQ+R9qZT+35aVbv28r6QTYLhERQwtpm55V+00kYYVo5WE4BUtiohT9
    To2I4pur3prM6Gw7AoNoEbNDIQO18LFbz0hTX5wDj06ci0RrTd0oRANCyowZI4Nr
    uuzDwH4NcvP/E5t4f5KnznIjE+WsdidS5f/f6All/aluoDZp51BjZQaXAGZN6c4e
    p9XKEZ1WsrKAj2D4HfbtSqAZLbk0CTp9UItzlj2c8oixkWv19LRsf+UEA5y6+hsg
    48D3AtPbAgMBAAECggEBALoFVt8RZR2VYYEu+f2UIrgP9jWp3FFcHdFIB6Qn0iwU
    AfdaqbJ91da4JsJVfqciZKqK6Pg0DHzSc17SEArawiWImh1guxBuimW54jjUyxU0
    Tc2EhxZVTRVT7MI9sRFws/kXuxCws7784UTg0Y5NY/IpFHinAoXyiikO8vjl73sg
    trN5mQGNTE/c8lEs7pUAFWX9zuNbmV0m1q25lHDgbkAD76/9X26lLCK1A5e2iCj3
    MME6/2GlSy3hrtSY7mCiR1GktvnK+yidXXJSkGMNCSopQARfcAlMvcCDav5ODxTz
    mB+A47oxGKBTdc9gGF44dR15y5E1kRAvTtaAIzpc14ECgYEA4u9uZkZS0gEiiA5K
    pOm/lnBp6bloGg9RlsOO5waE8DiGZgkwWuDwsncxUB1SvLd28MgxZzNQClncS98J
    viJzdAVzauMpn3Iqrdtk9drGzEeuxibic1FKMf1URGwKnlcsDHaeKAGyRQgO2Q7l
    Oy7EwtRmUKBUA3RCIqLSoiEi6NcCgYEA0u4a2abgYdyR1QMavgevqCGhuqu1Aa2Y
    rbD3TmIfGVubI2YZeFSyhC/7Jx+5HofQj5cpMRgASxzKXqrCXuyb+Q+u23kHogfQ
    cO1yO2GzjlA3FVHTK28t9EDPTOgHWQt3q7iS1s44VHwXDOpEQJ2onKKohvcP5WTf
    LO0T2K9NOJ0CgYEAtX9nHXc6/+iWdJhxjKnCaBBqNNrrboQ37ctj/FOTeQjMPMk2
    mkhzWVjI4NlC9doJz5NdJ7u7VTv/W9L7WMz256EAaUlbXcGSbtAcVCFwg6sFFke9
    Lxuhqo+AmOSMLY1sll88KKUKrfk+3szx+z5xcZ0sY2mHJ+gQiOEOc0rrP6sCgYBi
    Ksi6RU0mnoYMki5PBLq+0DA59ZH/XvCw3ayrgUUiAx1XwzvVYe3XUZFc6wm36NOr
    EFnubFIuow6YMnbVwN7yclcZ8+EWivZ6qDfC5Tyw3ipUtMlH7K2BgOw5yb8ptQmU
    FQnaCQ30W/BKZXkwbW+8voMalT+DroejnA7hiOyyjQKBgFLi6x6w76fTgQ7Ts8x0
    eATLOrvdvfotuLyMSsQLbljXyJznCTNrOGfYTua/Ifgkn4LpnoOkkxvVbj/Eugc7
    WeXBG+gbEi25GZUktrZWP1uc6s8aXH6rjYJP8iXnUpFHmQAPGuGiFnfB5MxlSns9
    9SKBXe7AvKGknGf7zg8WLKJZ
    -----END PRIVATE KEY-----
snis:
- created_at: 1604424834
  id: 2c16a2da-0ec3-4bbd-b947-740008eff39b
  tags: ~
  name: example.test
  certificate: 3c43a3a1-c47a-4806-b8f3-fb03b55c7cfa
services:
- url: http://httpbin.org/anything
  routes:
  - paths:
    - /
  3. Start Kong:
$ kong start -v --conf dbless.conf
  4. In one terminal window, start reloading:
$ for i in {1..1000}; do ./bin/kong reload -v --conf dbless.conf; done
  5. In a second terminal window, start running sslscan:

    $ for i in {1..1000}; do sslscan --sni-name=example.test 127.0.0.1:8443 | grep Subject; done
  6. Check the output of sslscan:

Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  kong_clustering
Subject:  localhost
Subject:  localhost
Subject:  kong_clustering
Subject:  kong_clustering

It looks like, for a short period of time after a reload, Kong returns the wrong cert. In my testing it returned to normal afterwards, but this is still unexpected.
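
For anyone verifying the same behavior without sslscan, the subject of the certificate Kong presents can also be checked with openssl s_client (a minimal sketch against the repro setup above; the port and SNI name are those from step 5):

for i in {1..1000}; do
  # Print the subject of the cert presented for the example.test SNI
  echo | openssl s_client -connect 127.0.0.1:8443 -servername example.test 2>/dev/null \
    | openssl x509 -noout -subject
done

On an affected build, the subject briefly flips from kong_clustering to the default localhost cert around each reload, matching the sslscan output above.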

bungle commented 3 years ago

This should be fixed in the upcoming Kong 2.3.0, as #6661 was just merged.

morningspace commented 2 years ago

@bungle I happened to come across this issue after seeing a similar problem when running Kong. The only difference is that I'm running Kong in a k8s cluster in DB-less mode.

2022/08/29 10:47:36 [alert] 28#0: *148953 ignoring stale global SSL error (SSL: error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding error:04067072:rsa routines:rsa_ossl_public_decrypt:padding check failed), client: x.x.x.x, server: kong, request: "GET /x/y/z HTTP/1.1", host: "x.y.z"

Any suggestions on how to troubleshoot this case? BTW, we are using Kong 2.5.0.
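
This is not from the thread, but one way to narrow it down is to check which certificate the proxy actually presents for a given SNI around the time new config is pushed (a minimal sketch; the namespace, service name, and SNI are placeholders to adjust for your install):

# Forward the Kong proxy's TLS port locally (service/namespace names are assumptions)
kubectl -n kong port-forward service/kong-proxy 8443:443 &

# Inspect the certificate presented for a given SNI
echo | openssl s_client -connect 127.0.0.1:8443 -servername example.test 2>/dev/null \
  | openssl x509 -noout -subject -issuer

If the subject briefly falls back to the default Kong certificate (CN=localhost) while the configuration is being reloaded, that would match the behavior reproduced earlier in this thread.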

yasra002 commented 2 years ago

We also have the same issue while running Kong on k8s in DBLess mode. We are using Kong 2.7.0.

Any suggestions on how to troubleshoot and fix it?

parth-aggarwal commented 9 months ago

Hi @yasra002 @morningspace, were you able to load an SSL certificate on Kong deployed in DB-less mode in a K8s cluster? I am facing the same issue.