envoyproxy / envoy

Cloud-native high-performance edge/middle/service proxy
https://www.envoyproxy.io
Apache License 2.0

Error with gRPC python server upstream with SSL/TLS enabled. #4291

Closed jerome-quere closed 6 years ago

jerome-quere commented 6 years ago

I cannot make a gRPC call to an upstream Python server with SSL/TLS enabled.


                      Unsecure                           SSL GRPC
+-----------------+   Call  OK    +-----------------+    CALL  KO   +-----------------+
| Py / Go  client +--------------->      Envoy      +--------------->  Python server  |
+-------+---------+               +-----------------+               +---------^-------+
        |                                                                     |
        |                                                                     |
        +---------------------------------------------------------------------+
                                    SSL GRPC call OK

When calling the Python server directly with a Python or Go gRPC client, it works fine. When using Envoy to forward the gRPC call, the client receives a gRPC "unavailable" error. The logs show a successful TLS handshake between Envoy and the Python server.

Repro steps: I made a demo repo here: https://github.com/jerome-quere/ssl-grpc-python-server-with-envoy.

Admin and Stats Output:

cluster.python-server.bind_errors: 0
cluster.python-server.external.upstream_rq_503: 1
cluster.python-server.external.upstream_rq_5xx: 1
cluster.python-server.http2.header_overflow: 0
cluster.python-server.http2.headers_cb_no_stream: 0
cluster.python-server.http2.rx_messaging_error: 0
cluster.python-server.http2.rx_reset: 0
cluster.python-server.http2.too_many_header_frames: 0
cluster.python-server.http2.trailers: 0
cluster.python-server.http2.tx_reset: 0
cluster.python-server.lb_healthy_panic: 0
cluster.python-server.lb_local_cluster_not_ok: 0
cluster.python-server.lb_recalculate_zone_structures: 0
cluster.python-server.lb_subsets_active: 0
cluster.python-server.lb_subsets_created: 0
cluster.python-server.lb_subsets_fallback: 0
cluster.python-server.lb_subsets_removed: 0
cluster.python-server.lb_subsets_selected: 0
cluster.python-server.lb_zone_cluster_too_small: 0
cluster.python-server.lb_zone_no_capacity_left: 0
cluster.python-server.lb_zone_number_differs: 0
cluster.python-server.lb_zone_routing_all_directly: 0
cluster.python-server.lb_zone_routing_cross_zone: 0
cluster.python-server.lb_zone_routing_sampled: 0
cluster.python-server.max_host_weight: 1
cluster.python-server.membership_change: 1
cluster.python-server.membership_healthy: 1
cluster.python-server.membership_total: 1
cluster.python-server.original_dst_host_invalid: 0
cluster.python-server.retry_or_shadow_abandoned: 0
cluster.python-server.ssl.ciphers.ECDHE-RSA-AES128-GCM-SHA256: 1
cluster.python-server.ssl.connection_error: 0
cluster.python-server.ssl.fail_verify_cert_hash: 0
cluster.python-server.ssl.fail_verify_error: 0
cluster.python-server.ssl.fail_verify_no_cert: 0
cluster.python-server.ssl.fail_verify_san: 0
cluster.python-server.ssl.handshake: 1
cluster.python-server.ssl.no_certificate: 0
cluster.python-server.ssl.session_reused: 0
cluster.python-server.update_attempt: 4
cluster.python-server.update_empty: 0
cluster.python-server.update_failure: 0
cluster.python-server.update_no_rebuild: 0
cluster.python-server.update_success: 4
cluster.python-server.upstream_cx_active: 0
cluster.python-server.upstream_cx_close_notify: 0
cluster.python-server.upstream_cx_connect_attempts_exceeded: 0
cluster.python-server.upstream_cx_connect_fail: 0
cluster.python-server.upstream_cx_connect_timeout: 0
cluster.python-server.upstream_cx_destroy: 0
cluster.python-server.upstream_cx_destroy_local: 0
cluster.python-server.upstream_cx_destroy_local_with_active_rq: 0
cluster.python-server.upstream_cx_destroy_remote: 0
cluster.python-server.upstream_cx_destroy_remote_with_active_rq: 1
cluster.python-server.upstream_cx_destroy_with_active_rq: 1
cluster.python-server.upstream_cx_http1_total: 0
cluster.python-server.upstream_cx_http2_total: 1
cluster.python-server.upstream_cx_idle_timeout: 0
cluster.python-server.upstream_cx_max_requests: 0
cluster.python-server.upstream_cx_none_healthy: 0
cluster.python-server.upstream_cx_overflow: 0
cluster.python-server.upstream_cx_protocol_error: 0
cluster.python-server.upstream_cx_rx_bytes_buffered: 0
cluster.python-server.upstream_cx_rx_bytes_total: 0
cluster.python-server.upstream_cx_total: 1
cluster.python-server.upstream_cx_tx_bytes_buffered: 0
cluster.python-server.upstream_cx_tx_bytes_total: 307
cluster.python-server.upstream_flow_control_backed_up_total: 0
cluster.python-server.upstream_flow_control_drained_total: 0
cluster.python-server.upstream_flow_control_paused_reading_total: 0
cluster.python-server.upstream_flow_control_resumed_reading_total: 0
cluster.python-server.upstream_rq_503: 1
cluster.python-server.upstream_rq_5xx: 1
cluster.python-server.upstream_rq_active: 0
cluster.python-server.upstream_rq_cancelled: 0
cluster.python-server.upstream_rq_maintenance_mode: 0
cluster.python-server.upstream_rq_pending_active: 0
cluster.python-server.upstream_rq_pending_failure_eject: 1
cluster.python-server.upstream_rq_pending_overflow: 0
cluster.python-server.upstream_rq_pending_total: 0
cluster.python-server.upstream_rq_per_try_timeout: 0
cluster.python-server.upstream_rq_retry: 0
cluster.python-server.upstream_rq_retry_overflow: 0
cluster.python-server.upstream_rq_retry_success: 0
cluster.python-server.upstream_rq_rx_reset: 0
cluster.python-server.upstream_rq_timeout: 0
cluster.python-server.upstream_rq_total: 1
cluster.python-server.upstream_rq_tx_reset: 0
cluster.python-server.version: 0
cluster_manager.active_clusters: 1
cluster_manager.cluster_added: 1
cluster_manager.cluster_modified: 0
cluster_manager.cluster_removed: 0
cluster_manager.warming_clusters: 0
filesystem.flushed_by_timer: 0
filesystem.reopen_failed: 0
filesystem.write_buffered: 1
filesystem.write_completed: 1
filesystem.write_total_buffered: 0
http.admin.downstream_cx_active: 2
http.admin.downstream_cx_destroy: 0
http.admin.downstream_cx_destroy_active_rq: 0
http.admin.downstream_cx_destroy_local: 0
http.admin.downstream_cx_destroy_local_active_rq: 0
http.admin.downstream_cx_destroy_remote: 0
http.admin.downstream_cx_destroy_remote_active_rq: 0
http.admin.downstream_cx_drain_close: 0
http.admin.downstream_cx_http1_active: 1
http.admin.downstream_cx_http1_total: 1
http.admin.downstream_cx_http2_active: 0
http.admin.downstream_cx_http2_total: 0
http.admin.downstream_cx_idle_timeout: 0
http.admin.downstream_cx_protocol_error: 0
http.admin.downstream_cx_rx_bytes_buffered: 452
http.admin.downstream_cx_rx_bytes_total: 866
http.admin.downstream_cx_ssl_active: 0
http.admin.downstream_cx_ssl_total: 0
http.admin.downstream_cx_total: 2
http.admin.downstream_cx_tx_bytes_buffered: 0
http.admin.downstream_cx_tx_bytes_total: 4757
http.admin.downstream_cx_websocket_active: 0
http.admin.downstream_cx_websocket_total: 0
http.admin.downstream_flow_control_paused_reading_total: 0
http.admin.downstream_flow_control_resumed_reading_total: 0
http.admin.downstream_rq_1xx: 0
http.admin.downstream_rq_2xx: 1
http.admin.downstream_rq_3xx: 0
http.admin.downstream_rq_4xx: 0
http.admin.downstream_rq_5xx: 0
http.admin.downstream_rq_active: 1
http.admin.downstream_rq_http1_total: 2
http.admin.downstream_rq_http2_total: 0
http.admin.downstream_rq_non_relative_path: 0
http.admin.downstream_rq_response_before_rq_complete: 0
http.admin.downstream_rq_rx_reset: 0
http.admin.downstream_rq_too_large: 0
http.admin.downstream_rq_total: 2
http.admin.downstream_rq_tx_reset: 0
http.admin.downstream_rq_ws_on_non_ws_route: 0
http.admin.rs_too_large: 0
http.async-client.no_cluster: 0
http.async-client.no_route: 0
http.async-client.rq_direct_response: 0
http.async-client.rq_redirect: 0
http.async-client.rq_total: 0
http.ingress_grpc.downstream_cx_active: 0
http.ingress_grpc.downstream_cx_destroy: 1
http.ingress_grpc.downstream_cx_destroy_active_rq: 0
http.ingress_grpc.downstream_cx_destroy_local: 0
http.ingress_grpc.downstream_cx_destroy_local_active_rq: 0
http.ingress_grpc.downstream_cx_destroy_remote: 1
http.ingress_grpc.downstream_cx_destroy_remote_active_rq: 0
http.ingress_grpc.downstream_cx_drain_close: 0
http.ingress_grpc.downstream_cx_http1_active: 0
http.ingress_grpc.downstream_cx_http1_total: 0
http.ingress_grpc.downstream_cx_http2_active: 0
http.ingress_grpc.downstream_cx_http2_total: 1
http.ingress_grpc.downstream_cx_idle_timeout: 0
http.ingress_grpc.downstream_cx_protocol_error: 0
http.ingress_grpc.downstream_cx_rx_bytes_buffered: 0
http.ingress_grpc.downstream_cx_rx_bytes_total: 399
http.ingress_grpc.downstream_cx_ssl_active: 0
http.ingress_grpc.downstream_cx_ssl_total: 0
http.ingress_grpc.downstream_cx_total: 1
http.ingress_grpc.downstream_cx_tx_bytes_buffered: 0
http.ingress_grpc.downstream_cx_tx_bytes_total: 171
http.ingress_grpc.downstream_cx_websocket_active: 0
http.ingress_grpc.downstream_cx_websocket_total: 0
http.ingress_grpc.downstream_flow_control_paused_reading_total: 0
http.ingress_grpc.downstream_flow_control_resumed_reading_total: 0
http.ingress_grpc.downstream_rq_1xx: 0
http.ingress_grpc.downstream_rq_2xx: 1
http.ingress_grpc.downstream_rq_3xx: 0
http.ingress_grpc.downstream_rq_4xx: 0
http.ingress_grpc.downstream_rq_5xx: 0
http.ingress_grpc.downstream_rq_active: 0
http.ingress_grpc.downstream_rq_http1_total: 0
http.ingress_grpc.downstream_rq_http2_total: 1
http.ingress_grpc.downstream_rq_non_relative_path: 0
http.ingress_grpc.downstream_rq_response_before_rq_complete: 0
http.ingress_grpc.downstream_rq_rx_reset: 0
http.ingress_grpc.downstream_rq_too_large: 0
http.ingress_grpc.downstream_rq_total: 1
http.ingress_grpc.downstream_rq_tx_reset: 0
http.ingress_grpc.downstream_rq_ws_on_non_ws_route: 0
http.ingress_grpc.no_cluster: 0
http.ingress_grpc.no_route: 0
http.ingress_grpc.rq_direct_response: 0
http.ingress_grpc.rq_redirect: 0
http.ingress_grpc.rq_total: 1
http.ingress_grpc.rs_too_large: 0
http.ingress_grpc.tracing.client_enabled: 0
http.ingress_grpc.tracing.health_check: 0
http.ingress_grpc.tracing.not_traceable: 0
http.ingress_grpc.tracing.random_sampling: 0
http.ingress_grpc.tracing.service_forced: 0
http2.header_overflow: 0
http2.headers_cb_no_stream: 0
http2.rx_messaging_error: 0
http2.rx_reset: 0
http2.too_many_header_frames: 0
http2.trailers: 0
http2.tx_reset: 0
listener.0.0.0.0_1337.downstream_cx_active: 0
listener.0.0.0.0_1337.downstream_cx_destroy: 1
listener.0.0.0.0_1337.downstream_cx_total: 1
listener.0.0.0.0_1337.http.ingress_grpc.downstream_rq_1xx: 0
listener.0.0.0.0_1337.http.ingress_grpc.downstream_rq_2xx: 1
listener.0.0.0.0_1337.http.ingress_grpc.downstream_rq_3xx: 0
listener.0.0.0.0_1337.http.ingress_grpc.downstream_rq_4xx: 0
listener.0.0.0.0_1337.http.ingress_grpc.downstream_rq_5xx: 0
listener.0.0.0.0_1337.no_filter_chain_match: 0
listener.admin.downstream_cx_active: 2
listener.admin.downstream_cx_destroy: 0
listener.admin.downstream_cx_total: 2
listener.admin.http.admin.downstream_rq_1xx: 0
listener.admin.http.admin.downstream_rq_2xx: 1
listener.admin.http.admin.downstream_rq_3xx: 0
listener.admin.http.admin.downstream_rq_4xx: 0
listener.admin.http.admin.downstream_rq_5xx: 0
listener.admin.no_filter_chain_match: 0
listener_manager.listener_added: 1
listener_manager.listener_create_failure: 0
listener_manager.listener_create_success: 4
listener_manager.listener_modified: 0
listener_manager.listener_removed: 0
listener_manager.total_listeners_active: 1
listener_manager.total_listeners_draining: 0
listener_manager.total_listeners_warming: 0
runtime.admin_overrides_active: 0
runtime.load_error: 0
runtime.load_success: 0
runtime.num_keys: 0
runtime.override_dir_exists: 0
runtime.override_dir_not_exists: 0
server.days_until_first_cert_expiring: 2147483647
server.hot_restart_epoch: 0
server.live: 1
server.memory_allocated: 3278288
server.memory_heap_size: 4194304
server.parent_connections: 0
server.total_connections: 0
server.uptime: 15
server.version: 4151803
server.watchdog_mega_miss: 0
server.watchdog_miss: 0
stats.overflow: 0
cluster.python-server.upstream_cx_connect_ms: P0(nan,7) P25(nan,7.025) P50(nan,7.05) P75(nan,7.075) P90(nan,7.09) P95(nan,7.095) P99(nan,7.099) P99.9(nan,7.0999) P100(nan,7.1)
cluster.python-server.upstream_cx_length_ms: P0(nan,7) P25(nan,7.025) P50(nan,7.05) P75(nan,7.075) P90(nan,7.09) P95(nan,7.095) P99(nan,7.099) P99.9(nan,7.0999) P100(nan,7.1)
http.admin.downstream_cx_length_ms: No recorded values
http.admin.downstream_rq_time: P0(3,3) P25(3.025,3.025) P50(3.05,3.05) P75(3.075,3.075) P90(3.09,3.09) P95(3.095,3.095) P99(3.099,3.099) P99.9(3.0999,3.0999) P100(3.1,3.1)
http.ingress_grpc.downstream_cx_length_ms: P0(nan,120) P25(nan,122.5) P50(nan,125) P75(nan,127.5) P90(nan,129) P95(nan,129.5) P99(nan,129.9) P99.9(nan,129.99) P100(nan,130)
http.ingress_grpc.downstream_rq_time: P0(nan,8) P25(nan,8.025) P50(nan,8.05) P75(nan,8.075) P90(nan,8.09) P95(nan,8.095) P99(nan,8.099) P99.9(nan,8.0999) P100(nan,8.1)
listener.0.0.0.0_1337.downstream_cx_length_ms: P0(nan,120) P25(nan,122.5) P50(nan,125) P75(nan,127.5) P90(nan,129) P95(nan,129.5) P99(nan,129.9) P99.9(nan,129.99) P100(nan,130)
listener.admin.downstream_cx_length_ms: No recorded values
python-server::default_priority::max_connections::1024
python-server::default_priority::max_pending_requests::1024
python-server::default_priority::max_requests::1024
python-server::default_priority::max_retries::3
python-server::high_priority::max_connections::1024
python-server::high_priority::max_pending_requests::1024
python-server::high_priority::max_requests::1024
python-server::high_priority::max_retries::3
python-server::added_via_api::false
python-server::172.24.0.2:1337::cx_active::0
python-server::172.24.0.2:1337::cx_connect_fail::0
python-server::172.24.0.2:1337::cx_total::1
python-server::172.24.0.2:1337::rq_active::0
python-server::172.24.0.2:1337::rq_error::1
python-server::172.24.0.2:1337::rq_success::0
python-server::172.24.0.2:1337::rq_timeout::0
python-server::172.24.0.2:1337::rq_total::1
python-server::172.24.0.2:1337::health_flags::healthy
python-server::172.24.0.2:1337::weight::1
python-server::172.24.0.2:1337::region::
python-server::172.24.0.2:1337::zone::
python-server::172.24.0.2:1337::sub_zone::
python-server::172.24.0.2:1337::canary::false
python-server::172.24.0.2:1337::success_rate::-1
envoy 3f59fb5c0f6554f8b3f2e73ab4c1437a63d42668/1.7.0/Clean/RELEASE live 106 106 0

Config:

admin:
  access_log_path: "/dev/null"
  address:
    socket_address:
      address: 0.0.0.0
      port_value: 8080

static_resources:
  listeners:
  - address:
      socket_address:
        address: 0.0.0.0
        port_value: 1337
    filter_chains:
    - filters:
      - name: envoy.http_connection_manager
        config:
          codec_type: http2
          stat_prefix: ingress_grpc
          route_config:
            virtual_hosts:
            - name: backend
              domains: ["*"]
              routes:
                match:
                  prefix: "/"
                route:
                  cluster: server
          http_filters:
          - name: envoy.router
            config: {}

  clusters:
  - name: server
    connect_timeout: 10s
    type: strict_dns
    lb_policy: round_robin
    http2_protocol_options: {}
    hosts:
    - socket_address:
        address: server
        port_value: 1337
    tls_context:
      common_tls_context:
        tls_certificates:
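(The `tls_certificates` entries above are truncated in the report; see the demo repo for the full file.) For comparison, a filled-in upstream `tls_context` that also offers HTTP/2 via ALPN might look like the sketch below. The certificate paths are placeholders, and whether a missing ALPN offer is the actual culprit here is an assumption to verify, not a confirmed diagnosis:

```yaml
    tls_context:
      common_tls_context:
        # Offer h2 during the upstream TLS handshake; gRPC servers
        # commonly require ALPN before accepting the connection.
        alpn_protocols: ["h2"]
        tls_certificates:
        - certificate_chain:
            filename: "/etc/envoy/client-cert.pem"  # placeholder path
          private_key:
            filename: "/etc/envoy/client-key.pem"   # placeholder path
```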

Logs:

[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:183] initializing epoch 0 (hot restart version=10.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363 size=2654312)
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:185] statically linked extensions:
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:187]   access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:190]   filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.rbac,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:193]   filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:196]   filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.thrift_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:198]   stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.statsd
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:200]   tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:203]   transport_sockets.downstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-08-28 09:42:39.907][1][info][main] source/server/server.cc:206]   transport_sockets.upstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-08-28 09:42:39.909][1][debug][main] source/server/server.cc:233] admin address: 0.0.0.0:8080
[2018-08-28 09:42:39.910][1][info][config] source/server/configuration_impl.cc:50] loading 0 static secret(s)
[2018-08-28 09:42:39.910][10][debug][grpc] source/common/grpc/google_async_client_impl.cc:39] completionThread running
[2018-08-28 09:42:39.911][1][debug][upstream] source/common/upstream/cluster_manager_impl.cc:682] adding TLS initial cluster server
[2018-08-28 09:42:39.911][1][debug][upstream] source/common/upstream/upstream_impl.cc:866] starting async DNS resolution for server
[2018-08-28 09:42:39.911][1][debug][upstream] source/common/network/dns_impl.cc:147] Setting DNS resolution timer for 5000 milliseconds
[2018-08-28 09:42:39.911][1][debug][upstream] source/common/upstream/cluster_manager_impl.cc:60] cm init: adding: cluster=server primary=1 secondary=0
[2018-08-28 09:42:39.911][1][info][config] source/server/configuration_impl.cc:60] loading 1 listener(s)
[2018-08-28 09:42:39.911][1][debug][config] source/server/configuration_impl.cc:62] listener #0:
[2018-08-28 09:42:39.911][1][debug][config] source/server/listener_manager_impl.cc:511] begin add/update listener: name=35236eed-84dd-40d4-80a6-8812fe002922 hash=221110905671805882
[2018-08-28 09:42:39.911][1][debug][config] source/server/listener_manager_impl.cc:38]   filter #0:
[2018-08-28 09:42:39.911][1][debug][config] source/server/listener_manager_impl.cc:39]     name: envoy.http_connection_manager
[2018-08-28 09:42:39.911][1][debug][config] source/server/listener_manager_impl.cc:42]   config: {"http_filters":[{"config":{},"name":"envoy.router"}],"route_config":{"virtual_hosts":[{"name":"backend","domains":["*"],"routes":{"route":{"cluster":"server"},"match":{"prefix":"/"}}}]},"stat_prefix":"ingress_grpc","codec_type":"http2"}
[2018-08-28 09:42:39.912][1][debug][config] source/extensions/filters/network/http_connection_manager/config.cc:232]     filter #0
[2018-08-28 09:42:39.912][1][debug][config] source/extensions/filters/network/http_connection_manager/config.cc:233]       name: envoy.router
[2018-08-28 09:42:39.912][1][debug][config] source/extensions/filters/network/http_connection_manager/config.cc:237]     config: {}
[2018-08-28 09:42:39.912][1][debug][config] source/server/listener_manager_impl.cc:409] add active listener: name=35236eed-84dd-40d4-80a6-8812fe002922, hash=221110905671805882, address=0.0.0.0:1337
[2018-08-28 09:42:39.912][1][info][config] source/server/configuration_impl.cc:94] loading tracing configuration
[2018-08-28 09:42:39.912][1][info][config] source/server/configuration_impl.cc:116] loading stats sink configuration
[2018-08-28 09:42:39.912][1][info][main] source/server/server.cc:398] starting main dispatch loop
[2018-08-28 09:42:39.912][1][debug][upstream] source/common/network/dns_impl.cc:147] Setting DNS resolution timer for 4686 milliseconds
[2018-08-28 09:42:39.912][1][debug][upstream] source/common/upstream/upstream_impl.cc:873] async DNS resolution complete for server
[2018-08-28 09:42:39.913][1][debug][upstream] source/common/upstream/upstream_impl.cc:892] DNS hosts have changed for server
[2018-08-28 09:42:39.913][1][debug][upstream] source/common/upstream/cluster_manager_impl.cc:776] membership update for TLS cluster server
[2018-08-28 09:42:39.913][1][debug][upstream] source/common/upstream/cluster_manager_impl.cc:88] cm init: init complete: cluster=server primary=0 secondary=0
[2018-08-28 09:42:39.913][1][info][upstream] source/common/upstream/cluster_manager_impl.cc:132] cm init: all clusters initialized
[2018-08-28 09:42:39.913][1][info][main] source/server/server.cc:378] all clusters initialized. initializing init manager
[2018-08-28 09:42:39.913][1][info][config] source/server/listener_manager_impl.cc:781] all dependencies initialized. starting workers
[2018-08-28 09:42:39.913][13][debug][main] source/server/worker_impl.cc:91] worker entering dispatch loop
[2018-08-28 09:42:39.913][14][debug][main] source/server/worker_impl.cc:91] worker entering dispatch loop
[2018-08-28 09:42:39.913][12][debug][main] source/server/worker_impl.cc:91] worker entering dispatch loop
[2018-08-28 09:42:39.913][13][debug][upstream] source/common/upstream/cluster_manager_impl.cc:682] adding TLS initial cluster server
[2018-08-28 09:42:39.913][14][debug][upstream] source/common/upstream/cluster_manager_impl.cc:682] adding TLS initial cluster server
[2018-08-28 09:42:39.913][13][debug][upstream] source/common/upstream/cluster_manager_impl.cc:776] membership update for TLS cluster server
[2018-08-28 09:42:39.913][14][debug][upstream] source/common/upstream/cluster_manager_impl.cc:776] membership update for TLS cluster server
[2018-08-28 09:42:39.913][16][debug][grpc] source/common/grpc/google_async_client_impl.cc:39] completionThread running
[2018-08-28 09:42:39.913][12][debug][upstream] source/common/upstream/cluster_manager_impl.cc:682] adding TLS initial cluster server
[2018-08-28 09:42:39.913][12][debug][upstream] source/common/upstream/cluster_manager_impl.cc:776] membership update for TLS cluster server
[2018-08-28 09:42:39.913][15][debug][main] source/server/worker_impl.cc:91] worker entering dispatch loop
[2018-08-28 09:42:39.914][15][debug][upstream] source/common/upstream/cluster_manager_impl.cc:682] adding TLS initial cluster server
[2018-08-28 09:42:39.914][15][debug][upstream] source/common/upstream/cluster_manager_impl.cc:776] membership update for TLS cluster server
[2018-08-28 09:42:39.914][18][debug][grpc] source/common/grpc/google_async_client_impl.cc:39] completionThread running
[2018-08-28 09:42:39.914][17][debug][grpc] source/common/grpc/google_async_client_impl.cc:39] completionThread running
[2018-08-28 09:42:39.914][19][debug][grpc] source/common/grpc/google_async_client_impl.cc:39] completionThread running
[2018-08-28 09:42:44.906][1][debug][upstream] source/common/upstream/upstream_impl.cc:866] starting async DNS resolution for server
[2018-08-28 09:42:44.906][1][debug][upstream] source/common/network/dns_impl.cc:147] Setting DNS resolution timer for 3124 milliseconds
[2018-08-28 09:42:44.906][1][debug][main] source/server/server.cc:119] flushing stats
[2018-08-28 09:42:44.907][1][debug][upstream] source/common/network/dns_impl.cc:147] Setting DNS resolution timer for 3750 milliseconds
[2018-08-28 09:42:44.907][1][debug][upstream] source/common/upstream/upstream_impl.cc:873] async DNS resolution complete for server
[2018-08-28 09:42:47.774][14][debug][main] source/server/connection_handler_impl.cc:217] [C0] new connection
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/connection_impl.cc:389] [C0] socket event: 3
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/connection_impl.cc:457] [C0] write ready
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/connection_impl.cc:427] [C0] read ready
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/raw_buffer_socket.cc:21] [C0] read returns: 399
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/raw_buffer_socket.cc:21] [C0] read returns: -1
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/raw_buffer_socket.cc:29] [C0] read error: 11
[2018-08-28 09:42:47.775][14][debug][http2] source/common/http/http2/codec_impl.cc:621] [C0] setting stream-level initial window size to 268435456
[2018-08-28 09:42:47.775][14][debug][http2] source/common/http/http2/codec_impl.cc:643] [C0] updating connection-level initial window size to 268435456
[2018-08-28 09:42:47.775][14][trace][http2] source/common/http/http2/codec_impl.cc:277] [C0] dispatching 399 bytes
[2018-08-28 09:42:47.775][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=4
[2018-08-28 09:42:47.775][14][debug][http] source/common/http/conn_manager_impl.cc:187] [C0] new stream
[2018-08-28 09:42:47.775][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=1
[2018-08-28 09:42:47.775][14][debug][http] source/common/http/conn_manager_impl.cc:454] [C0][S8212982210066613568] request headers complete (end_stream=false):
':scheme', 'http'
':method', 'POST'
':authority', 'envoy:1337'
':path', '/Server/Foo'
'te', 'trailers'
'content-type', 'application/grpc'
'user-agent', 'grpc-python/1.14.1 grpc-c/6.0.0 (manylinux; chttp2; gladiolus)'
'grpc-accept-encoding', 'identity,deflate,gzip'
'accept-encoding', 'identity,gzip'

[2018-08-28 09:42:47.775][14][debug][router] source/common/router/router.cc:253] [C0][S8212982210066613568] cluster 'server' match for URL '/Server/Foo'
[2018-08-28 09:42:47.775][14][debug][router] source/common/router/router.cc:304] [C0][S8212982210066613568] router decoding headers:
':scheme', 'https'
':method', 'POST'
':authority', 'envoy:1337'
':path', '/Server/Foo'
'te', 'trailers'
'content-type', 'application/grpc'
'user-agent', 'grpc-python/1.14.1 grpc-c/6.0.0 (manylinux; chttp2; gladiolus)'
'grpc-accept-encoding', 'identity,deflate,gzip'
'accept-encoding', 'identity,gzip'
'x-forwarded-proto', 'http'
'x-request-id', '95b43fac-0f4e-4999-8adf-42e476a61ba1'
'x-envoy-expected-rq-timeout-ms', '15000'

[2018-08-28 09:42:47.775][14][debug][client] source/common/http/codec_client.cc:25] [C1] connecting
[2018-08-28 09:42:47.775][14][debug][connection] source/common/network/connection_impl.cc:569] [C1] connecting to 172.19.0.2:1337
[2018-08-28 09:42:47.775][14][debug][connection] source/common/network/connection_impl.cc:578] [C1] connection in progress
[2018-08-28 09:42:47.775][14][debug][http2] source/common/http/http2/codec_impl.cc:621] [C1] setting stream-level initial window size to 268435456
[2018-08-28 09:42:47.775][14][debug][http2] source/common/http/http2/codec_impl.cc:643] [C1] updating connection-level initial window size to 268435456
[2018-08-28 09:42:47.775][14][debug][pool] source/common/http/http2/conn_pool.cc:97] [C1] creating stream
[2018-08-28 09:42:47.775][14][debug][router] source/common/router/router.cc:972] [C0][S8212982210066613568] pool ready
[2018-08-28 09:42:47.775][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C1] send data: bytes=24
[2018-08-28 09:42:47.775][14][trace][connection] source/common/network/connection_impl.cc:326] [C1] writing 24 bytes, end_stream false
[2018-08-28 09:42:47.775][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C1] send data: bytes=21
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C1] writing 21 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C1] sent frame type=4
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C1] send data: bytes=13
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C1] writing 13 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C1] sent frame type=8
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C1] send data: bytes=235
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C1] writing 235 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C1] sent frame type=1
[2018-08-28 09:42:47.776][14][trace][http] source/common/http/conn_manager_impl.cc:660] [C0][S8212982210066613568] decode headers called: filter=0x336d6d0 status=1
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=8
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=0
[2018-08-28 09:42:47.776][14][debug][http] source/common/http/conn_manager_impl.cc:788] [C0][S8212982210066613568] request end stream
[2018-08-28 09:42:47.776][14][trace][router] source/common/router/router.cc:872] [C0][S8212982210066613568] proxying 5 bytes
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C1] writing 14 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C1] sent frame type=0
[2018-08-28 09:42:47.776][14][trace][http] source/common/http/conn_manager_impl.cc:722] [C0][S8212982210066613568] decode data called: filter=0x336d6d0 status=3
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=8
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:335] [C0] recv frame type=6
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:292] [C0] dispatched 399 bytes
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C0] send data: bytes=15
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C0] writing 15 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C0] sent frame type=4
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C0] send data: bytes=9
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C0] writing 9 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C0] sent frame type=4
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C0] send data: bytes=17
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C0] writing 17 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C0] sent frame type=6
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C0] send data: bytes=13
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:326] [C0] writing 13 bytes, end_stream false
[2018-08-28 09:42:47.776][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C0] sent frame type=8
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:389] [C0] socket event: 2
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:457] [C0] write ready
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/raw_buffer_socket.cc:63] [C0] write returns: 54
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:389] [C1] socket event: 2
[2018-08-28 09:42:47.776][14][trace][connection] source/common/network/connection_impl.cc:457] [C1] write ready
[2018-08-28 09:42:47.776][14][debug][connection] source/common/network/connection_impl.cc:466] [C1] connected
[2018-08-28 09:42:47.780][14][debug][connection] source/common/ssl/ssl_socket.cc:99] [C1] handshake complete
[2018-08-28 09:42:47.780][14][debug][client] source/common/http/codec_client.cc:63] [C1] connected
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:457] [C1] write ready
[2018-08-28 09:42:47.780][14][trace][connection] source/common/ssl/ssl_socket.cc:172] [C1] ssl write returns: 307
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:389] [C1] socket event: 3
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:457] [C1] write ready
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:427] [C1] read ready
[2018-08-28 09:42:47.780][14][trace][connection] source/common/ssl/ssl_socket.cc:57] [C1] ssl read returns: 0
[2018-08-28 09:42:47.780][14][debug][connection] source/common/network/connection_impl.cc:451] [C1] remote close
[2018-08-28 09:42:47.780][14][debug][connection] source/common/network/connection_impl.cc:133] [C1] closing socket: 0
[2018-08-28 09:42:47.780][14][debug][connection] source/common/ssl/ssl_socket.cc:208] [C1] SSL shutdown: rc=-1
[2018-08-28 09:42:47.780][14][debug][client] source/common/http/codec_client.cc:81] [C1] disconnect. resetting 1 pending requests
[2018-08-28 09:42:47.780][14][debug][client] source/common/http/codec_client.cc:104] [C1] request reset
[2018-08-28 09:42:47.780][14][trace][main] source/common/event/dispatcher_impl.cc:126] item added to deferred deletion list (size=1)
[2018-08-28 09:42:47.780][14][debug][pool] source/common/http/http2/conn_pool.cc:189] [C1] destroying stream: 0 remaining
[2018-08-28 09:42:47.780][14][debug][router] source/common/router/router.cc:458] [C0][S8212982210066613568] upstream reset
[2018-08-28 09:42:47.780][14][debug][http] source/common/http/conn_manager_impl.cc:974] [C0][S8212982210066613568] encoding headers via codec (end_stream=true):
':status', '200'
'content-type', 'application/grpc'
'grpc-status', '14'
'grpc-message', 'upstream connect error or disconnect/reset before headers'
'date', 'Tue, 28 Aug 2018 09:42:47 GMT'
'server', 'envoy'

[2018-08-28 09:42:47.780][14][trace][http2] source/common/http/http2/codec_impl.cc:492] [C0] send data: bytes=117
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:326] [C0] writing 117 bytes, end_stream false
[2018-08-28 09:42:47.780][14][trace][http2] source/common/http/http2/codec_impl.cc:446] [C0] sent frame type=1
[2018-08-28 09:42:47.780][14][debug][http2] source/common/http/http2/codec_impl.cc:501] [C0] stream closed: 0
[2018-08-28 09:42:47.780][14][trace][main] source/common/event/dispatcher_impl.cc:126] item added to deferred deletion list (size=2)
[2018-08-28 09:42:47.780][14][trace][main] source/common/event/dispatcher_impl.cc:126] item added to deferred deletion list (size=3)
[2018-08-28 09:42:47.780][14][debug][pool] source/common/http/http2/conn_pool.cc:125] [C1] destroying primary client
[2018-08-28 09:42:47.780][14][trace][main] source/common/event/dispatcher_impl.cc:126] item added to deferred deletion list (size=4)
[2018-08-28 09:42:47.780][14][trace][main] source/common/event/dispatcher_impl.cc:52] clearing deferred deletion list (size=4)
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:389] [C0] socket event: 2
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/connection_impl.cc:457] [C0] write ready
[2018-08-28 09:42:47.780][14][trace][connection] source/common/network/raw_buffer_socket.cc:63] [C0] write returns: 117
[2018-08-28 09:42:47.907][14][trace][connection] source/common/network/connection_impl.cc:389] [C0] socket event: 3
[2018-08-28 09:42:47.907][14][trace][connection] source/common/network/connection_impl.cc:457] [C0] write ready
[2018-08-28 09:42:47.907][14][trace][connection] source/common/network/connection_impl.cc:427] [C0] read ready
[2018-08-28 09:42:47.907][14][trace][connection] source/common/network/raw_buffer_socket.cc:21] [C0] read returns: 0
[2018-08-28 09:42:47.907][14][debug][connection] source/common/network/connection_impl.cc:451] [C0] remote close
[2018-08-28 09:42:47.907][14][debug][connection] source/common/network/connection_impl.cc:133] [C0] closing socket: 0
[2018-08-28 09:42:47.907][14][debug][main] source/server/connection_handler_impl.cc:50] [C0] adding to cleanup list
[2018-08-28 09:42:47.907][14][trace][main] source/common/event/dispatcher_impl.cc:126] item added to deferred deletion list (size=1)
[2018-08-28 09:42:47.907][14][trace][main] source/common/event/dispatcher_impl.cc:52] clearing deferred deletion list (size=1)
lizan commented 6 years ago

Do you have any logs from the gRPC server? Which version of gRPC are you using?

jerome-quere commented 6 years ago

OK, I found the problem. The Python gRPC server requires ALPN (the Go gRPC server doesn't). Enabling it in my Envoy cluster solved the problem:

clusters:
  - name: python-server
    connect_timeout: 10s
    type: strict_dns
    lb_policy: round_robin
    http2_protocol_options: {}
    hosts:
    - socket_address:
        address: python-server
        port_value: 1337
    tls_context:
      common_tls_context:
        tls_certificates:
        alpn_protocols: ["h2"] <====== Add this.

For the record, I was using the latest version of Python gRPC, grpcio-1.14.1.
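The ALPN negotiation that the fix above enables can be reproduced with the Python standard library alone. The sketch below is an illustration, not Envoy's or gRPC's actual code; it assumes the `openssl` CLI is on the PATH to mint a throwaway self-signed certificate. It shows that when the client offers `h2` via ALPN (which is what `alpn_protocols: ["h2"]` makes Envoy do in its upstream ClientHello), the server's `selected_alpn_protocol()` returns `h2` — the property the gRPC security connector checks, and whose absence produces the "missing selected ALPN property" error in the server log below.

```python
import os
import socket
import ssl
import subprocess
import tempfile
import threading

# Mint a throwaway self-signed certificate (assumes the `openssl` CLI exists).
tmp = tempfile.mkdtemp()
cert, key = os.path.join(tmp, "cert.pem"), os.path.join(tmp, "key.pem")
subprocess.run(
    ["openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
     "-keyout", key, "-out", cert, "-days", "1", "-subj", "/CN=localhost"],
    check=True, capture_output=True)

# Server side: advertise h2, like a gRPC server speaking HTTP/2 over TLS.
srv_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
srv_ctx.load_cert_chain(cert, key)
srv_ctx.set_alpn_protocols(["h2"])

listener = socket.create_server(("127.0.0.1", 0))
port = listener.getsockname()[1]
server_result = {}

def serve():
    conn, _ = listener.accept()
    with srv_ctx.wrap_socket(conn, server_side=True) as tls:
        # grpc-python's security connector rejects the peer if this is None.
        server_result["alpn"] = tls.selected_alpn_protocol()

t = threading.Thread(target=serve)
t.start()

# Client side: offer ["h2"], as Envoy does once alpn_protocols is configured.
cli_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
cli_ctx.check_hostname = False
cli_ctx.verify_mode = ssl.CERT_NONE  # demo only: self-signed certificate
cli_ctx.set_alpn_protocols(["h2"])

with socket.create_connection(("127.0.0.1", port)) as sock:
    with cli_ctx.wrap_socket(sock, server_hostname="localhost") as tls:
        client_alpn = tls.selected_alpn_protocol()

t.join()
print("client negotiated:", client_alpn)
print("server negotiated:", server_result["alpn"])
```

Note that the plain `ssl` module, unlike grpc-python, completes the handshake even when no ALPN protocol is selected — the gRPC server goes further and tears the connection down, which is exactly the remote close Envoy reported right after "handshake complete".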

Python server log with GRPC_VERBOSITY=debug and GRPC_TRACE=all:

I0830 11:45:40.329341562      11 ev_epoll1_linux.cc:706]     PS:0x5642bfc17730 BEGIN_STARTS:0x7f24966ebf40
I0830 11:45:40.329359398      11 ev_epoll1_linux.cc:725]     PS:0x5642bfc17730 BEGIN_REORG:0x7f24966ebf40 kick_state=UNKICKED is_reassigning=1
I0830 11:45:40.329362347      11 ev_epoll1_linux.cc:794]     PS:0x5642bfc17730 BEGIN_DONE:0x7f24966ebf40 kick_state=DESIGNATED_POLLER shutdown=0 kicked_without_poller: 0
I0830 11:45:40.511268247      11 ev_epoll1_linux.cc:686]     ps: 0x5642bfc17730 poll got 1 events
E0830 11:45:40.511326739      11 lockfree_event.cc:209]      LockfreeEvent::SetReady: 0x5642bfb7ade8 curr=0x5642bfb61788
I0830 11:45:40.511331160      11 ev_epoll1_linux.cc:886]     PS:0x5642bfc17730 END_WORKER:0x7f24966ebf40
I0830 11:45:40.511333700      11 ev_epoll1_linux.cc:866]      .. mark pollset 0x5642bfc17730 inactive
I0830 11:45:40.511349176      11 tcp_server_posix.cc:226]    SERVER_CONNECT: incoming connection: ipv4:172.18.0.2:59988
I0830 11:45:40.511352177      11 ev_posix.cc:202]            (fd-trace) fd_create(7, tcp-server-connection:ipv4:172.18.0.2:59988, 0)
I0830 11:45:40.511417569      11 handshaker.cc:141]          handshake_manager 0x7f248c001b70: adding handshaker security [0x7f248c00b3a0] at index 0
I0830 11:45:40.511442951      11 timer_generic.cc:365]       TIMER 0x7f248c001bf0: SET 122616 now 2616 call 0x7f248c001c20[0x7f249ba8dd60]
I0830 11:45:40.511445558      11 timer_generic.cc:401]         .. add to shard 0 with queue_deadline_cap=3002 => is_first_timer=false
I0830 11:45:40.511450895      11 handshaker.cc:212]          handshake_manager 0x7f248c001b70: error="No Error" shutdown=0 index=0, args={endpoint=0x7f248c001720, args=0x7f248c00b7a0 {size=2: grpc.server_credentials=0x5642bfba3c70, grpc.security_connector=0x5642bfbaf090}, read_buffer=0x7f248c00b850 (length=0), exit_early=0}
I0830 11:45:40.511467852      11 handshaker.cc:258]          handshake_manager 0x7f248c001b70: calling handshaker security [0x7f248c00b3a0] at index 0
I0830 11:45:40.511471476      11 tcp_posix.cc:217]           TCP:0x7f248c001720 notify_on_read
E0830 11:45:40.511486951      11 lockfree_event.cc:94]       LockfreeEvent::NotifyOn: 0x7f248c001408 curr=(nil) closure=0x7f248c0018c0
E0830 11:45:40.511493550      11 lockfree_event.cc:94]       LockfreeEvent::NotifyOn: 0x5642bfb7ade8 curr=(nil) closure=0x5642bfb61788
I0830 11:45:40.511514550      11 ev_epoll1_linux.cc:948]      .. remove worker
I0830 11:45:40.511517313      11 ev_epoll1_linux.cc:706]     PS:0x5642bfc17730 BEGIN_STARTS:0x7f24966ebf40
I0830 11:45:40.511520489      11 ev_epoll1_linux.cc:725]     PS:0x5642bfc17730 BEGIN_REORG:0x7f24966ebf40 kick_state=UNKICKED is_reassigning=1
I0830 11:45:40.511523737      11 ev_epoll1_linux.cc:794]     PS:0x5642bfc17730 BEGIN_DONE:0x7f24966ebf40 kick_state=DESIGNATED_POLLER shutdown=0 kicked_without_poller: 0
I0830 11:45:40.511526901      11 ev_epoll1_linux.cc:686]     ps: 0x5642bfc17730 poll got 1 events
E0830 11:45:40.511528808      11 lockfree_event.cc:209]      LockfreeEvent::SetReady: 0x7f248c001408 curr=0x7f248c0018c0
E0830 11:45:40.511530818      11 lockfree_event.cc:209]      LockfreeEvent::SetReady: 0x7f248c001410 curr=(nil)
I0830 11:45:40.511532762      11 ev_epoll1_linux.cc:886]     PS:0x5642bfc17730 END_WORKER:0x7f24966ebf40
I0830 11:45:40.511534655      11 ev_epoll1_linux.cc:866]      .. mark pollset 0x5642bfc17730 inactive
I0830 11:45:40.511549945      11 tcp_posix.cc:479]           TCP:0x7f248c001720 got_read: "No Error"
I0830 11:45:40.511551767      11 tcp_posix.cc:464]           TCP:0x7f248c001720 alloc_slices
I0830 11:45:40.511553861      11 resource_quota.cc:795]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: alloc 8192; free_pool -> -8192
I0830 11:45:40.511556387      11 resource_quota.cc:292]      RQ: check allocation for user 0x7f248c0019a0 shutdown=0 free_pool=-8192
I0830 11:45:40.511559245      11 resource_quota.cc:318]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: grant alloc 8192 bytes; rq_free_pool -> 9223372036854767615
I0830 11:45:40.511582160      11 tcp_posix.cc:446]           TCP:0x7f248c001720 read_allocation_done: "No Error"
I0830 11:45:40.511587333      11 tcp_posix.cc:357]           TCP:0x7f248c001720 call_cb 0x7f248c00b570 0x7f249bb04dc0:0x7f248c00b3a0
I0830 11:45:40.511589474      11 tcp_posix.cc:360]           read: error="No Error"
I0830 11:45:40.511594779      11 tcp_posix.cc:365]           READ 0x7f248c001720 (peer=ipv4:172.18.0.2:59988): 16 03 01 00 7e 01 00 00 7a 03 03 3d d0 4d f4 97 55 c9 7e 23 6d f6 02 e4 8d 63 15 56 42 d7 a4 46 33 6b 03 68 6c 53 b3 55 60 26 37 00 00 1c c0 2b cc a9 c0 2f cc a8 c0 09 c0 13 00 9c 00 2f c0 2c c0 30 c0 0a c0 14 00 9d 00 35 01 00 00 35 ff 01 00 01 00 00 17 00 00 00 23 00 00 00 0d 00 14 00 12 04 03 08 04 04 01 05 03 08 05 05 01 08 06 06 01 02 01 00 0b 00 02 01 00 00 0a 00 06 00 04 00 1d 00 17 '....~...z..=.M..U.~#m....c.VB..F3k.hlS.U`&7....+.../........./.,.0.......5...5..........#..........................................'
I0830 11:45:40.511600514      11 ssl_transport_security.cc:211]      HANDSHAKE START -        TLS server start_accept  - !!!!!!
I0830 11:45:40.511603163      11 ssl_transport_security.cc:211]                 LOOP -   TLS server read_client_hello  - !!!!!!
I0830 11:45:40.511613483      11 ssl_transport_security.cc:211]                 LOOP -  TLS server select_certificate  - !!!!!!
I0830 11:45:40.511650751      11 ssl_transport_security.cc:211]                 LOOP -   TLS server select_parameters  - !!!!!!
I0830 11:45:40.511678562      11 ssl_transport_security.cc:211]                 LOOP -   TLS server send_server_hello  - !!!!!!
I0830 11:45:40.511700040      11 ssl_transport_security.cc:211]                 LOOP - TLS server send_server_certifi  - !!!!!!
I0830 11:45:40.511931938      11 ssl_transport_security.cc:211]                 LOOP - TLS server send_server_key_exc  - !!!!!!
I0830 11:45:40.516167849      11 ssl_transport_security.cc:211]                 LOOP - TLS server send_server_hello_d  - !!!!!!
I0830 11:45:40.516216991      11 ssl_transport_security.cc:211]                 LOOP - TLS server read_client_certifi  - !!!!!!
I0830 11:45:40.516220729      11 ssl_transport_security.cc:211]                 LOOP - TLS server verify_client_certi  - !!!!!!
I0830 11:45:40.516227695      11 ssl_transport_security.cc:211]                 LOOP - TLS server read_client_key_exc  - !!!!!!
I0830 11:45:40.516276909      11 tcp_posix.cc:661]           WRITE 0x7f248c001720 (peer=ipv4:172.18.0.2:59988): 16 03 03 00 3f 02 00 00 3b 03 03 5b 87 d8 e4 9e f8 55 c6 ff d4 cc 71 18 96 c7 4d 9d 75 89 ef dd 8e e4 79 e8 09 b7 63 6e ce 75 18 00 c0 2f 00 00 13 ff 01 00 01 00 00 17 00 00 00 23 00 00 00 0b 00 02 01 00 16 03 03 02 d4 0b 00 02 d0 00 02 cd 00 02 ca 30 82 02 c6 30 82 01 ae 02 09 00 ea 17 09 bf fe 6d 20 e8 30 0d 06 09 2a 86 48 86 f7 0d 01 01 0b 05 00 30 25 31 0b 30 09 06 03 55 04 06 13 02 46 52 31 16 30 14 06 03 55 04 03 0c 0d 70 79 74 68 6f 6e 2d 73 65 72 76 65 72 30 1e 17 0d 31 38 30 38 32 39 31 33 30 33 31 35 5a 17 0d 31 39 30 38 32 39 31 33 30 33 31 35 5a 30 25 31 0b 30 09 06 03 55 04 06 13 02 46 52 31 16 30 14 06 03 55 04 03 0c 0d 70 79 74 68 6f 6e 2d 73 65 72 76 65 72 30 82 01 22 30 0d 06 09 2a 86 48 86 f7 0d 01 01 01 05 00 03 82 01 0f 00 30 82 01 0a 02 82 01 01 00 b4 18 56 8a 24 90 6f 4b 84 fc 92 6c aa 7b 1b b7 0b 48 e9 9b 93 49 49 1a 97 2a 00 78 a3 6d 32 37 68 14 64 21 73 02 d0 1e 92 62 46 1e 7b a1 52 03 83 b4 b8 e4 fe 3c b3 e7 78 fd 7f d2 07 ea 0e 2e 2a da e9 98 9f e9 24 1d 34 c9 95 f9 7c 61 ba 51 dd f4 7d fb 66 04 88 e8 6e 34 58 8d 57 d0 84 a1 15 66 21 bd 81 9d b8 29 b3 dd e6 1f cf 12 dc 07 be ac 9c 1e e2 15 07 2b 9b af de 79 f9 39 d0 9a 1f cc ea 7d e5 51 98 82 1d 21 b3 fd 7b 68 fc 83 96 33 42 05 c1 cf 84 2a da 43 63 3b ea ea 49 52 46 99 c7 46 ae 9f 97 73 eb a6 a0 79 92 33 62 d4 65 e6 bf 3e 98 3a 13 9a f6 b3 a3 cb 91 61 ff 2f 7c 71 8e ad 31 fb 27 82 73 62 1f db 5a cc 36 77 2c c4 75 bd 0e 5f 14 ea 5c 3d f5 07 de 8e 18 a3 70 e5 9f d8 4d ac a8 08 e4 9f 1b 0a 45 af dc af f7 ea 50 da 70 2f d8 c0 6f 65 2d 9e f9 f9 a8 f5 02 03 01 00 01 30 0d 06 09 2a 86 48 86 f7 0d 01 01 0b 05 00 03 82 01 01 00 33 a4 f1 f0 e6 4f 26 4c e1 67 28 d5 a6 15 dd a4 70 aa 10 1f b6 80 28 49 ad 12 09 2f d2 7c 16 96 da 2d fc 10 bc 30 4c eb 62 ce 96 f1 64 6d b5 2e 14 6e a9 de 13 85 31 d0 68 a0 3b 5a 92 d6 72 c7 dd 89 98 4c f4 fb b5 ba bd 87 ce 31 5a 25 36 25 00 86 4e ea e1 fd e8 55 
72 ef b5 2f 2b ea c4 01 0b de 1e 3b 94 d6 bf d3 41 b5 b4 51 07 d0 66 cd a2 c5 76 d3 50 18 85 a8 1c c7 91 2a 6a 6b d4 e8 57 16 4b 5c 04 21 c7 76 13 dc 47 73 80 17 10 e2 b4 89 8e 84 dd 40 23 52 ce 46 42 57 a7 39 6e fc ac 02 15 f3 73 9a 94 e0 bd 71 b5 e0 5e 12 92 f0 7a 32 0f 34 62 8b 5f 10 ae 50 83 5e f4 21 ac 32 40 76 7b 03 12 92 b4 5a a2 27 29 0a a6 fd 4f aa 5c ae 81 b9 14 e9 e6 37 9a f2 0f 23 b2 92 9d eb f3 ff 44 b2 5b 59 47 1a e3 43 b5 c0 f4 ba 93 06 50 19 c8 dc 5f 77 cc 36 db 55 4b c8 c9 b0 86 ea 16 03 03 01 4d 0c 00 01 49 03 00 17 41 04 68 5e de 94 6c b3 e3 db 69 7c 18 6c 6a ed 07 5e c9 c3 36 11 e8 85 53 8b 74 a9 58 d8 98 44 37 92 fb 8e 3c f2 95 29 3f 64 08 32 14 83 cd 2c 4e 42 7d 57 36 3b b1 b8 0e dc 97 e5 f8 d4 88 a5 d3 e5 08 04 01 00 2f cb f1 66 13 b2 db 59 f7 48 85 31 40 42 98 d8 87 15 c5 84 86 19 73 c3 a2 9a 8c 6d 2b 64 8f 00 c0 f4 e4 1e d2 10 5e 2a 63 e2 92 5d 18 f6 79 91 6a d8 0f ac e5 58 41 2b 1e df 52 42 c1 b8 bf 50 f2 18 78 51 70 62 ba 6a 21 9e da a0 f8 b7 65 39 3f 1f 2d 69 de a0 9a 74 ae e1 29 7d 17 1d 9a f7 65 19 31 ce 29 ea ea 99 54 9b fc 85 e7 61 e1 4f d4 07 39 de bd ff 48 cc dd 05 08 4a 24 40 3a 90 02 00 1e 41 b5 57 0f 81 9f 7b a0 4d 20 99 2e 1d 22 22 64 97 87 30 85 a0 3a 53 74 2c 3a ff 12 8a 83 e1 1f 70 82 b3 ee c5 62 59 2c 56 85 87 6e 63 9c ed 83 b4 ac ac b2 70 07 b0 66 b5 5f b4 d7 48 8e fe a6 c2 68 10 f0 37 85 4c 3b 1c b4 13 1c f3 3e 38 60 fc 74 54 6e 72 5e 8c 36 a5 b5 de 71 c5 5b 0f 5f da 64 a3 5f 78 d0 1e 5a c8 16 22 78 77 e3 1a d7 41 ab 23 3b eb c9 13 7a 4c 96 5e d6 7e 16 03 03 00 04 0e 00 00 00 '....?...;..[.....U....q...M.u.....y...cn.u.../.............#.......................0...0...........m 
.0...*.H........0%1.0...U....FR1.0...U....python-server0...180829130315Z..190829130315Z0%1.0...U....FR1.0...U....python-server0.."0...*.H.............0..........V.$.oK...l.{...H...II..*.x.m27h.d!s....bF.{.R......<..x.......*.....$.4...|a.Q..}.f...n4X.W....f!....)...............+...y.9.....}.Q...!..{h...3B....*.Cc;..IRF..F...s...y.3b.e..>.:.......a./|q..1.'.sb..Z.6w,.u.._..\=......p...M.......E.....P.p/..oe-..........0...*.H.............3....O&L.g(.....p.....(I.../.|...-...0L.b...dm...n....1.h.;Z..r....L.......1Z%6%..N....Ur../+......;....A..Q..f...v.P......*jk..W.K\.!.v..Gs.........@#R.FBW.9n.....s....q..^...z2.4b._..P.^.!.2@v{....Z.')...O.\......7...#......D.[YG..C......P..._w.6.UK.........M...I...A.h^..l...i|.lj..^..6...S.t.X..D7...<..)?d.2...,NB}W6;................/..f...Y.H.1@B........s....m+d........^*c..]..y.j....XA+..RB...P..xQpb.j!.....e9?.-i...t..)}....e.1.)...T....a.O..9...H....J$@:....A.W...{.M ...""d..0..:St,:......p....bY,V..nc.......p..f._..H....h..7.L;.....>8`.tTnr^.6...q.[._.d._x..Z.."xw...A.#;...zL.^.~.........'
I0830 11:45:40.516353045      11 tcp_posix.cc:689]           write: "No Error"
I0830 11:45:40.516370837      11 tcp_posix.cc:479]           TCP:0x7f248c001720 got_read: "No Error"
I0830 11:45:40.516377073      11 tcp_posix.cc:464]           TCP:0x7f248c001720 alloc_slices
I0830 11:45:40.516379501      11 resource_quota.cc:795]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: alloc 8192; free_pool -> -8192
I0830 11:45:40.516382427      11 resource_quota.cc:292]      RQ: check allocation for user 0x7f248c0019a0 shutdown=0 free_pool=-8192
I0830 11:45:40.516385023      11 resource_quota.cc:318]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: grant alloc 8192 bytes; rq_free_pool -> 9223372036854759423
I0830 11:45:40.516387677      11 tcp_posix.cc:446]           TCP:0x7f248c001720 read_allocation_done: "No Error"
I0830 11:45:40.516391147      11 tcp_posix.cc:217]           TCP:0x7f248c001720 notify_on_read
E0830 11:45:40.516393471      11 lockfree_event.cc:94]       LockfreeEvent::NotifyOn: 0x7f248c001408 curr=(nil) closure=0x7f248c0018c0
I0830 11:45:40.516395612      11 ev_epoll1_linux.cc:948]      .. remove worker
I0830 11:45:40.516398623      11 ev_epoll1_linux.cc:706]     PS:0x5642bfc17730 BEGIN_STARTS:0x7f24966ebf40
I0830 11:45:40.516400878      11 ev_epoll1_linux.cc:725]     PS:0x5642bfc17730 BEGIN_REORG:0x7f24966ebf40 kick_state=UNKICKED is_reassigning=1
I0830 11:45:40.516453892      11 ev_epoll1_linux.cc:794]     PS:0x5642bfc17730 BEGIN_DONE:0x7f24966ebf40 kick_state=DESIGNATED_POLLER shutdown=0 kicked_without_poller: 0
I0830 11:45:40.517140502      11 ev_epoll1_linux.cc:686]     ps: 0x5642bfc17730 poll got 1 events
E0830 11:45:40.517199191      11 lockfree_event.cc:209]      LockfreeEvent::SetReady: 0x7f248c001408 curr=0x7f248c0018c0
E0830 11:45:40.517205680      11 lockfree_event.cc:209]      LockfreeEvent::SetReady: 0x7f248c001410 curr=0x2
I0830 11:45:40.517209532      11 ev_epoll1_linux.cc:886]     PS:0x5642bfc17730 END_WORKER:0x7f24966ebf40
I0830 11:45:40.517213209      11 ev_epoll1_linux.cc:866]      .. mark pollset 0x5642bfc17730 inactive
I0830 11:45:40.517216824      11 tcp_posix.cc:479]           TCP:0x7f248c001720 got_read: "No Error"
I0830 11:45:40.517219957      11 tcp_posix.cc:470]           TCP:0x7f248c001720 do_read
I0830 11:45:40.517226787      11 tcp_posix.cc:357]           TCP:0x7f248c001720 call_cb 0x7f248c00b570 0x7f249bb04dc0:0x7f248c00b3a0
I0830 11:45:40.517246080      11 tcp_posix.cc:360]           read: error="No Error"
I0830 11:45:40.517286569      11 tcp_posix.cc:365]           READ 0x7f248c001720 (peer=ipv4:172.18.0.2:59988): 16 03 03 00 46 10 00 00 42 41 04 99 78 ab c8 63 5d cb f8 5d e3 73 9a 0e d7 a7 9d fe 68 41 0e 0a 7b c1 04 84 22 3d 91 ae 4f 18 e8 a8 ba 72 2d 51 a1 1e b0 88 6a 77 09 81 02 6c 25 06 e3 5b 58 0e 93 27 38 2c 58 a9 eb 43 81 f3 6d 14 03 03 00 01 01 16 03 03 00 28 00 00 00 00 00 00 00 00 50 1f 76 6f 58 0e 38 e7 9b 94 b4 28 63 4e 28 9b 27 da 91 4f 40 ea 17 83 19 9c 73 59 17 e1 1c 51 '....F...BA..x..c]..].s......hA..{..."=..O....r-Q....jw...l%..[X..'8,X..C..m..........(........P.voX.8....(cN(.'..O@.....sY...Q'
I0830 11:45:40.518024812      11 ssl_transport_security.cc:211]                 LOOP - TLS server read_client_certifi  - !!!!!!
I0830 11:45:40.518047055      11 ssl_transport_security.cc:211]                 LOOP - TLS server read_change_cipher_  - !!!!!!
I0830 11:45:40.518050667      11 ssl_transport_security.cc:211]                 LOOP - TLS server process_change_ciph  - !!!!!!
I0830 11:45:40.518064950      11 ssl_transport_security.cc:211]                 LOOP -     TLS server read_next_proto  - !!!!!!
I0830 11:45:40.518067173      11 ssl_transport_security.cc:211]                 LOOP -     TLS server read_channel_id  - !!!!!!
I0830 11:45:40.518069975      11 ssl_transport_security.cc:211]                 LOOP - TLS server read_client_finishe  - !!!!!!
I0830 11:45:40.518083892      11 ssl_transport_security.cc:211]                 LOOP - TLS server send_server_finishe  - !!!!!!
I0830 11:45:40.518163479      11 ssl_transport_security.cc:211]                 LOOP - TLS server finish_server_hands  - !!!!!!
I0830 11:45:40.518187306      11 ssl_transport_security.cc:211]                 LOOP -                TLS server done  - !!!!!!
I0830 11:45:40.518191487      11 ssl_transport_security.cc:211]       HANDSHAKE DONE -                TLS server done  - !!!!!!
I0830 11:45:40.518227917      11 tcp_posix.cc:661]           WRITE 0x7f248c001720 (peer=ipv4:172.18.0.2:59988): 16 03 03 00 ba 04 00 00 b6 00 00 1c 20 00 b0 49 42 da bb 6f c7 e9 a2 c0 f3 20 43 f4 45 43 a4 79 a7 7e 7b 25 6a 91 ab cf 90 5c 3a 43 ba b6 00 80 0c ab 4f 11 2b 72 02 b4 d8 73 b6 0d 5f fb bd fa 8b d1 35 d9 67 f0 d8 20 95 b7 e9 6d f7 0f 79 1a 7b a1 d2 04 fa 38 38 90 3b e4 e2 06 cf 4b b3 e2 30 45 c9 b4 5f 93 6e 3e 89 76 32 13 04 61 f4 10 35 d8 33 7d 67 88 9c bf 8f 13 f7 b6 be b9 19 39 13 6b bc 09 9c e8 90 fb fd c9 44 c3 44 b2 b9 c5 b9 af 90 ea 50 3e de 95 49 0c 15 db d7 4b 54 d9 3b 6c bf f5 93 33 b5 4b 8f 14 3d 2f f1 63 d6 7f 0d 0a 30 61 63 15 fd 47 dd b5 4a 54 ad 24 cc 14 03 03 00 01 01 16 03 03 00 28 00 00 00 00 00 00 00 00 ea 59 95 25 c8 c7 61 90 09 b6 23 f4 81 d2 4e 69 af a3 c7 14 28 04 30 81 8c a9 8c 9d dc a2 f9 e3 '............ ..IB..o..... C.EC.y.~{%j....\:C......O.+r...s.._.....5.g.. ...m..y.{....88.;....K..0E.._.n>.v2..a..5.3}g..........9.k........D.D.......P>..I....KT.;l...3.K..=/.c....0ac..G..JT.$...........(.........Y.%..a...#...Ni....(.0.........'
I0830 11:45:40.518272268      11 tcp_posix.cc:689]           write: "No Error"
D0830 11:45:40.518293349      11 security_handshaker.cc:129] Security handshake failed: {"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
I0830 11:45:40.518298546      11 ev_posix.cc:222]            (fd-trace) fd_shutdown(7)
E0830 11:45:40.518302651      11 lockfree_event.cc:160]      LockfreeEvent::SetShutdown: 0x7f248c001408 curr=(nil) err={"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
E0830 11:45:40.518397792      11 lockfree_event.cc:160]      LockfreeEvent::SetShutdown: 0x7f248c001410 curr=0x2 err={"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
E0830 11:45:40.518405480      11 lockfree_event.cc:160]      LockfreeEvent::SetShutdown: 0x7f248c001418 curr=(nil) err={"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
I0830 11:45:40.518415053      11 handshaker.cc:212]          handshake_manager 0x7f248c001b70: error={"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858} shutdown=0 index=1, args={endpoint=(nil), args=(nil) {size=0: (null)}, read_buffer=(nil) (length=0), exit_early=0}
I0830 11:45:40.518419084      11 handshaker.cc:245]          handshake_manager 0x7f248c001b70: handshaking complete -- scheduling on_handshake_done with error={"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
I0830 11:45:40.518423287      11 timer_generic.cc:467]       TIMER 0x7f248c001bf0: CANCEL pending=true
D0830 11:45:40.518446824      11 chttp2_server.cc:113]       Handshaking failed: {"created":"@1535629540.518280971","description":"Cannot check peer: missing selected ALPN property.","file":"src/core/lib/security/security_connector/security_connector.cc","file_line":858}
I0830 11:45:40.518610739      11 resource_quota.cc:818]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: free 8192; free_pool -> 8192
I0830 11:45:40.518683830      11 resource_quota.cc:818]      RQ anonymous_pool_7f248c001580 ipv4:172.18.0.2:59988: free 8192; free_pool -> 16384
I0830 11:45:40.518738141      11 ev_posix.cc:215]            (fd-trace) grpc_fd_orphan, fd:7 closed
I0830 11:45:40.518862216      11 resource_quota.cc:508]      RU shutdown 0x7f248c0019a0
I0830 11:45:40.518889534      11 ev_epoll1_linux.cc:948]      .. remove worker
I0830 11:45:40.518896317      11 ev_epoll1_linux.cc:706]     PS:0x5642bfc17730 BEGIN_STARTS:0x7f24966ebf40
I0830 11:45:40.518932748      11 ev_epoll1_linux.cc:725]     PS:0x5642bfc17730 BEGIN_REORG:0x7f24966ebf40 kick_state=UNKICKED is_reassigning=1
I0830 11:45:40.518962007      11 ev_epoll1_linux.cc:794]     PS:0x5642bfc17730 BEGIN_DONE:0x7f24966ebf40 kick_state=DESIGNATED_POLLER shutdown=0 kicked_without_poller: 0
I0830 11:45:40.530304519      11 ev_epoll1_linux.cc:686]     ps: 0x5642bfc17730 poll got 0 events
I0830 11:45:40.530344582      11 ev_epoll1_linux.cc:886]     PS:0x5642bfc17730 END_WORKER:0x7f24966ebf40
I0830 11:45:40.530348684      11 ev_epoll1_linux.cc:866]      .. mark pollset 0x5642bfc17730 inactive
I0830 11:45:40.530351431      11 ev_epoll1_linux.cc:948]      .. remove worker
I0830 11:45:40.530355612      11 completion_queue.cc:951]    RETURN_EVENT[0x5642bfc17660]: QUEUE_TIMEOUT
I0830 11:45:40.530364777      11 completion_queue.cc:851]    grpc_completion_queue_next(cq=0x5642bfc17660, deadline=gpr_timespec { tv_sec: 1535629540, tv_nsec: 730362056, clock_type: 1 }, reserved=(nil))
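For context, the handshake failure above ("Cannot check peer: missing selected ALPN property") indicates that the Python gRPC server requires the TLS client to negotiate `h2` via ALPN, which Envoy does not advertise on upstream TLS connections unless configured to. A minimal sketch of the cluster-side fix, assuming the `tls_context` config syntax current at the time of this issue and a cluster named `python-server` (names and ports are illustrative):

```yaml
clusters:
  - name: python-server
    connect_timeout: 0.25s
    type: strict_dns
    lb_policy: round_robin
    # Upstream speaks gRPC, so Envoy must use HTTP/2 to it.
    http2_protocol_options: {}
    hosts:
      - socket_address: { address: python-server, port_value: 50051 }
    tls_context:
      common_tls_context:
        # Advertise h2 via ALPN so the gRPC server's TLS handshake can
        # select a protocol; without this, the server rejects the
        # connection with "missing selected ALPN property".
        alpn_protocols: ["h2"]
```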
ivasonn commented 6 years ago

This may be a similar issue. After replacing the Go service with a Python one (https://gist.github.com/ivasonn/4a45842af8307d0d3368c54fef13ec06) in the grpc-bridge example, I get an empty response with the following headers: `{'content-length': '0', 'grpc-message': 'upstream connect error or disconnect/reset before headers', 'server': 'envoy', 'x-envoy-upstream-service-time': '1', 'grpc-status': '14', 'date': 'Mon, 03 Sep 2018 18:48:08 GMT', 'content-type': 'application/grpc'}` Is this a mistake on my side? Thank you in advance.

lizan commented 6 years ago

@ivasonn could you share your Envoy config? Does a similar cluster config with ALPN configured resolve your problem?

ivasonn commented 6 years ago

I did not change anything in https://github.com/envoyproxy/envoy/blob/master/examples/grpc-bridge/config/s2s-grpc-envoy.yaml. No, configuring ALPN did not fix my problem.

lizan commented 6 years ago

@ivasonn that's a different issue because you're not using TLS. Can you open a new issue with all the information asked for in the issue template?

ivasonn commented 6 years ago

@lizan thanks. New issue: https://github.com/envoyproxy/envoy/issues/4342, with the solution in a comment there.

mmahimtura commented 5 years ago

Is my issue the same as this one? https://github.com/envoyproxy/envoy/issues/7738