zycforgithub opened this issue 2 years ago
The first log shows that it does not establish a connection at all. Are you sure the server is running on the specified machine on port 443?
Yes, I'm sure. Let me show you another log, this time with grpc 1.24.11. The server is the same in both cases.
@murgatroid99
I0301 15:21:21.185661000 4590949888 call.cc:626] OP[client_idle:0x1100e17f0]: SEND_INITIAL_METADATA{key=3a 70 61 74 68 ':path' value=2f 6a 64 79 5f 6c 6f 67 5f 71 75 65 72 79 2e 4a 64 79 4c 6f 67 51 75 65 72 79 2f 46 69 6e 64 46 69 6c 65 53 74 61 74 73 4f 77 6e 65 72 42 79 4f 77 6e 65 72 49 64 '/jdy_log_query.JdyLogQuery/FindFileStatsOwnerByOwnerId'} RECV_INITIAL_METADATA
I0301 15:21:21.185680000 4590949888 channel_stack.cc:226] OP[client-channel:0x1100e1808]: SEND_INITIAL_METADATA{key=3a 70 61 74 68 ':path' value=2f 6a 64 79 5f 6c 6f 67 5f 71 75 65 72 79 2e 4a 64 79 4c 6f 67 51 75 65 72 79 2f 46 69 6e 64 46 69 6c 65 53 74 61 74 73 4f 77 6e 65 72 42 79 4f 77 6e 65 72 49 64 '/jdy_log_query.JdyLogQuery/FindFileStatsOwnerByOwnerId'} RECV_INITIAL_METADATA
I0301 15:21:21.185691000 4590949888 client_channel.cc:2249] chand=0x1100e8c60 calld=0x1100e1830: adding pending batch at index 0
I0301 15:21:21.185697000 4590949888 client_channel.cc:2075] chand=0x1100e8c60 calld=0x1100e1830: grabbing data plane mutex to perform pick
I0301 15:21:21.185704000 4590949888 connectivity_state.cc:79] CONWATCH: 0x1100e8d30 client_channel: get IDLE
I0301 15:21:21.185709000 4590949888 client_channel.cc:3699] chand=0x1100e8c60 calld=0x1100e1830: adding to queued picks list
I0301 15:21:21.186064000 4590949888 resolving_lb_policy.cc:196] resolving_lb=0x600006c6b180: starting name resolution
I0301 15:21:21.186318000 4590949888 client_channel.cc:1273] chand=0x1100e8c60: update: state=CONNECTING picker=0x600000892b20
I0301 15:21:21.186333000 4590949888 connectivity_state.cc:147] SET: 0x1100e8d30 client_channel: IDLE --> CONNECTING [helper]
I0301 15:21:21.186590000 4590949888 client_channel.cc:3847] chand=0x1100e8c60 calld=0x1100e1830: LB pick returned QUEUE (subchannel=0x0, error="No Error")
D0301 15:21:21.186599000 4590949888 dns_resolver.cc:242] Start resolving.
I0301 15:21:21.186794000 4590949888 client_channel.cc:1551] chand=0x1100e8c60: created resolving_lb_policy=0x600006c6b180
I0301 15:21:21.186830000 4590949888 metadata_array.cc:29] grpc_metadata_array_init(array=0x6000022f8968)
I0301 15:21:21.186945000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1100e0e60, ops=0x6000022f8910, nops=1, tag=0x6000011b6310, reserved=0x0)
I0301 15:21:21.186955000 4590949888 call.cc:1566] ops[0]: RECV_STATUS_ON_CLIENT metadata=0x6000022f8968 status=0x6000022f8980 details=0x6000022f8988
I0301 15:21:21.191421000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1100e0e60, ops=0x6000022f4aa0, nops=1, tag=0x6000011bac40, reserved=0x0)
I0301 15:21:21.191438000 4590949888 call.cc:1566] ops[0]: SEND_MESSAGE ptr=0x1080b2350
I0301 15:21:21.191642000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1100e0e60, ops=0x6000022f2990, nops=1, tag=0x6000011bad00, reserved=0x0)
I0301 15:21:21.191655000 4590949888 call.cc:1566] ops[0]: SEND_CLOSE_FROM_CLIENT
I0301 15:21:21.192066000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1100e0e60, ops=0x6000022f2bc0, nops=1, tag=0x6000011bae80, reserved=0x0)
I0301 15:21:21.192080000 4590949888 call.cc:1566] ops[0]: RECV_MESSAGE ptr=0x60000003dea8
I0301 15:21:21.192144000 4590949888 completion_queue.cc:960] grpc_completion_queue_next(cq=0x104a1c9d0, deadline=gpr_timespec { tv_sec: -9223372036854775808, tv_nsec: 0, clock_type: 0 }, reserved=0x0)
I0301 15:21:21.192157000 4590949888 completion_queue.cc:1060] RETURN_EVENT[0x104a1c9d0]: QUEUE_TIMEOUT
I0301 15:21:21.195835000 4590949888 ssl_credentials.cc:130] grpc_ssl_credentials_create(pem_root_certs=-----BEGIN CERTIFICATE-----
MIIDkzCCAnugAwIBAgIJAPd1meUMrKb4MA0GCSqGSIb3DQEBBQUAMGkxCzAJBgNV
BAYTAkNOMRAwDgYDVQQIDAdKaWFuZ3N1MQ0wCwYDVQQHDARXdVhpMRcwFQYDVQQK
DA5qaWFuZGFveXVuLmNvbTEgMB4GA1UEAwwXKi5ncnBjLmpkeS1pbnRlcm5hbC5j
b20wHhcNMjIwMjIyMTQxNjExWhcNMzIwMjIwMTQxNjExWjBpMQswCQYDVQQGEwJD
TjEQMA4GA1UECAwHSmlhbmdzdTENMAsGA1UEBwwEV3VYaTEXMBUGA1UECgwOamlh
bmRhb3l1bi5jb20xIDAeBgNVBAMMFyouZ3JwYy5qZHktaW50ZXJuYWwuY29tMIIB
IjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAoBky0znElYXJ++4G4fV7s59u
GfKytrw6+CeuAPVQD+SFHsWhFfD0sX3mdOcaSIFBI3sElpNf87hCzmJlhxQLEQTQ
5eOctl0isLc3J03egBE4sxEblE5rKsjpSLNwDoQFMpXGWY7U4VwNZ6fc9iWwis0h
xjn8+848MqoXWay8Y8vJqTNfWWwEAKNk7Qnf+pgYNuin+qavBugGWoxVXgje53kt
Vk9BUf5y8+atDcogeMlBSjNPYy0s3hZQ50SDBFB9j2ZtV/hO/HVh9k9R3Mz3BzEI
eixy4bwJtTB+mgsKHPfy1cjaNwX5ekQ+00HEUpUbRMZdHAoo9Vp0v+N3ukhXhQID
AQABoz4wPDAJBgNVHRMEAjAAMAsGA1UdDwQEAwIF4DAiBgNVHREEGzAZghcqLmdy
cGMuamR5LWludGVybmFsLmNvbTANBgkqhkiG9w0BAQUFAAOCAQEATQeZngXW2n75
wXkJ8AgBErLr7/iwEG2qCwUdwt0ieRWQ/nd4ggTzuy78Riym/IyndBHDqdMp2wwD
quWdeB9BMT1cwHQ5ZsBXnPaELmLuK55tVl3lYu4AgpmMXMWCmgSc9iNWQa18jLUT
9RWyzgi/RJnUiAqZSkVc1iamqZcHMimUJgR/anKkIoCPBeNZWX90KE/SF4gB3xvg
fDq0zqoVhOM5xIrXSjCsLoGWNhRa3Hu4R/ZosTCdw+KG5RU+ASUmGLtYHao4GuKG
/kgYuuSoHH2OJWvBQqT8Ky9lNcxEcmryzehlp6FKex4Fe8oibo9G5IituWle7DJP
saChpdc7cg==
-----END CERTIFICATE-----, pem_key_cert_pair=0x0, verify_options=0x7ff7bfeee460, reserved=0x0)
I0301 15:21:21.196523000 4590949888 secure_channel_create.cc:201] grpc_secure_channel_create(creds=0x600002701920, target=dev.test.com:443, args=0x600000007940, reserved=0x0)
I0301 15:21:21.196604000 4590949888 init.cc:164] grpc_init(void)
I0301 15:21:21.196669000 4590949888 client_idle_filter.cc:294] (client idle filter) created with max_leisure_time = 1800000 ms
I0301 15:21:21.196686000 4590949888 client_channel.cc:1367] chand=0x110916e60: creating client_channel for channel stack 0x110916c80
I0301 15:21:21.197010000 4590949888 client_idle_filter.cc:206] (client idle filter) call counter has increased to 1
I0301 15:21:21.197029000 4590949888 channel.cc:322] grpc_channel_get_target(channel=0x110916c00)
I0301 15:21:21.197098000 4590949888 metadata_array.cc:29] grpc_metadata_array_init(array=0x600000892d48)
I0301 15:21:21.197113000 4590949888 metadata_array.cc:29] grpc_metadata_array_init(array=0x600000892d28)
I0301 15:21:21.197281000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1108e6060, ops=0x60001335a800, nops=2, tag=0x6000011c4ff0, reserved=0x0)
I0301 15:21:21.197292000 4590949888 call.cc:1566] ops[0]: SEND_INITIAL_METADATA(nil)
I0301 15:21:21.197298000 4590949888 call.cc:1566] ops[1]: RECV_INITIAL_METADATA ptr=0x600000892d28
I0301 15:21:21.197312000 4590949888 call.cc:626] OP[client_idle:0x1108e69f0]: SEND_INITIAL_METADATA{key=3a 70 61 74 68 ':path' value=2f 6a 64 79 5f 6c 6f 67 5f 71 75 65 72 79 2e 4a 64 79 4c 6f 67 51 75 65 72 79 2f 46 69 6e 64 46 69 6c 65 53 74 61 74 73 41 70 70 42 79 4f 77 6e 65 72 49 64 '/jdy_log_query.JdyLogQuery/FindFileStatsAppByOwnerId'} RECV_INITIAL_METADATA
I0301 15:21:21.197321000 4590949888 channel_stack.cc:226] OP[client-channel:0x1108e6a08]: SEND_INITIAL_METADATA{key=3a 70 61 74 68 ':path' value=2f 6a 64 79 5f 6c 6f 67 5f 71 75 65 72 79 2e 4a 64 79 4c 6f 67 51 75 65 72 79 2f 46 69 6e 64 46 69 6c 65 53 74 61 74 73 41 70 70 42 79 4f 77 6e 65 72 49 64 '/jdy_log_query.JdyLogQuery/FindFileStatsAppByOwnerId'} RECV_INITIAL_METADATA
I0301 15:21:21.197365000 4590949888 client_channel.cc:2249] chand=0x110916e60 calld=0x1108e6a30: adding pending batch at index 0
I0301 15:21:21.197371000 4590949888 client_channel.cc:2075] chand=0x110916e60 calld=0x1108e6a30: grabbing data plane mutex to perform pick
I0301 15:21:21.197377000 4590949888 connectivity_state.cc:79] CONWATCH: 0x110916f30 client_channel: get IDLE
I0301 15:21:21.197382000 4590949888 client_channel.cc:3699] chand=0x110916e60 calld=0x1108e6a30: adding to queued picks list
I0301 15:21:21.197395000 4590949888 resolving_lb_policy.cc:196] resolving_lb=0x600006c54100: starting name resolution
I0301 15:21:21.197401000 4590949888 client_channel.cc:1273] chand=0x110916e60: update: state=CONNECTING picker=0x600000892ce0
I0301 15:21:21.197406000 4590949888 connectivity_state.cc:147] SET: 0x110916f30 client_channel: IDLE --> CONNECTING [helper]
I0301 15:21:21.197476000 4590949888 client_channel.cc:3847] chand=0x110916e60 calld=0x1108e6a30: LB pick returned QUEUE (subchannel=0x0, error="No Error")
D0301 15:21:21.197481000 4590949888 dns_resolver.cc:242] Start resolving.
I0301 15:21:21.197497000 4590949888 client_channel.cc:1551] chand=0x110916e60: created resolving_lb_policy=0x600006c54100
I0301 15:21:21.197537000 4590949888 metadata_array.cc:29] grpc_metadata_array_init(array=0x6000022ed688)
I0301 15:21:21.197648000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1108e6060, ops=0x6000022ed630, nops=1, tag=0x6000011c51a0, reserved=0x0)
I0301 15:21:21.197657000 4590949888 call.cc:1566] ops[0]: RECV_STATUS_ON_CLIENT metadata=0x6000022ed688 status=0x6000022ed6a0 details=0x6000022ed6a8
I0301 15:21:21.199631000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1108e6060, ops=0x6000022f8910, nops=1, tag=0x6000011b63a0, reserved=0x0)
I0301 15:21:21.199657000 4590949888 call.cc:1566] ops[0]: SEND_MESSAGE ptr=0x104cf54f0
I0301 15:21:21.199796000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1108e6060, ops=0x6000022f8910, nops=1, tag=0x6000011b5e60, reserved=0x0)
I0301 15:21:21.199806000 4590949888 call.cc:1566] ops[0]: SEND_CLOSE_FROM_CLIENT
I0301 15:21:21.200106000 4590949888 call.cc:1967] grpc_call_start_batch(call=0x1108e6060, ops=0x6000022f8aa0, nops=1, tag=0x6000011b6490, reserved=0x0)
I0301 15:21:21.200120000 4590949888 call.cc:1566] ops[0]: RECV_MESSAGE ptr=0x600000044f28
I0301 15:21:21.202771000 4590949888 resolving_lb_policy.cc:441] resolving_lb=0x600006c6b180: got resolver result
I0301 15:21:21.202794000 4590949888 client_channel.cc:3712] chand=0x1100e8c60 calld=0x1100e1830: applying service config to call
I0301 15:21:21.202802000 4590949888 resolving_lb_policy.cc:339] resolving_lb=0x600006c6b180: Creating new child policy pick_first
I0301 15:21:21.203110000 4590949888 pick_first.cc:146] Pick First 0x1081c5630 created.
I0301 15:21:21.203130000 4590949888 resolving_lb_policy.cc:393] resolving_lb=0x600006c6b180: created new LB policy "pick_first" (0x1081c5630)
I0301 15:21:21.203137000 4590949888 resolving_lb_policy.cc:356] resolving_lb=0x600006c6b180: Updating child policy 0x1081c5630
I0301 15:21:21.203143000 4590949888 pick_first.cc:265] Pick First 0x1081c5630 received update with 1 addresses
I0301 15:21:21.203481000 4590949888 subchannel_list.h:370] [pick_first 0x1081c5630] Creating subchannel list 0x104ad9060 for 1 subchannels
I0301 15:21:21.205343000 4590949888 client_channel.cc:831] chand=0x1100e8c60: creating subchannel wrapper 0x6000022e7480 for subchannel 0x1081c6760
I0301 15:21:21.205361000 4590949888 subchannel_list.h:420] [pick_first 0x1081c5630] subchannel list 0x104ad9060 index 0: Created subchannel 0x6000022e7480 for address uri ipv4:172.128.2.1:443
I0301 15:21:21.205369000 4590949888 subchannel_list.h:316] [pick_first 0x1081c5630] subchannel list 0x104ad9060 index 0 of 1 (subchannel 0x6000022e7480): starting watch (from IDLE)
I0301 15:21:21.205380000 4590949888 client_channel.cc:992] chand=0x1100e8c60: connectivity change for subchannel wrapper 0x6000022e7480 subchannel 0x1081c6760 (connected_subchannel=0x0 state=CONNECTING); hopping into combiner
I0301 15:21:21.205393000 4590949888 tcp_client_custom.cc:151] CLIENT_CONNECT: 0x6000011cc930 ipv4:172.128.2.1:443: asynchronously connecting
I0301 15:21:21.205529000 4590949888 client_channel.cc:1040] chand=0x1100e8c60: processing connectivity change in combiner for subchannel wrapper 0x6000022e7480 subchannel 0x1081c6760 (connected_subchannel=0x0 state=CONNECTING): watcher=0x60000089a180
I0301 15:21:21.205539000 4590949888 subchannel_list.h:249] [pick_first 0x1081c5630] subchannel list 0x104ad9060 index 0 of 1 (subchannel 0x6000022e7480): connectivity changed: state=CONNECTING, shutting_down=0, pending_watcher=0x60000089a180
I0301 15:21:21.205545000 4590949888 client_channel.cc:1273] chand=0x1100e8c60: update: state=CONNECTING picker=0x60000089a2e0
I0301 15:21:21.205551000 4590949888 connectivity_state.cc:147] SET: 0x1100e8d30 client_channel: CONNECTING --> CONNECTING [helper]
I0301 15:21:21.205558000 4590949888 client_channel.cc:3847] chand=0x1100e8c60 calld=0x1100e1830: LB pick returned QUEUE (subchannel=0x0, error="No Error")
I0301 15:21:21.205569000 4590949888 resolving_lb_policy.cc:441] resolving_lb=0x600006c54100: got resolver result
I0301 15:21:21.205574000 4590949888 client_channel.cc:3712] chand=0x110916e60 calld=0x1108e6a30: applying service config to call
I0301 15:21:21.205580000 4590949888 resolving_lb_policy.cc:339] resolving_lb=0x600006c54100: Creating new child policy pick_first
I0301 15:21:21.205586000 4590949888 pick_first.cc:146] Pick First 0x1081c7210 created.
I0301 15:21:21.205591000 4590949888 resolving_lb_policy.cc:393] resolving_lb=0x600006c54100: created new LB policy "pick_first" (0x1081c7210)
I0301 15:21:21.205703000 4590949888 resolving_lb_policy.cc:356] resolving_lb=0x600006c54100: Updating child policy 0x1081c7210
I0301 15:21:21.205714000 4590949888 pick_first.cc:265] Pick First 0x1081c7210 received update with 1 addresses
I0301 15:21:21.205724000 4590949888 subchannel_list.h:370] [pick_first 0x1081c7210] Creating subchannel list 0x104cf5630 for 1 subchannels
I0301 15:21:21.206109000 4590949888 client_channel.cc:831] chand=0x110916e60: creating subchannel wrapper 0x6000022e7660 for subchannel 0x1081c7320
I0301 15:21:21.206124000 4590949888 subchannel_list.h:420] [pick_first 0x1081c7210] subchannel list 0x104cf5630 index 0: Created subchannel 0x6000022e7660 for address uri ipv4:172.128.2.1:443
I0301 15:21:21.206131000 4590949888 subchannel_list.h:316] [pick_first 0x1081c7210] subchannel list 0x104cf5630 index 0 of 1 (subchannel 0x6000022e7660): starting watch (from IDLE)
I0301 15:21:21.206138000 4590949888 client_channel.cc:992] chand=0x110916e60: connectivity change for subchannel wrapper 0x6000022e7660 subchannel 0x1081c7320 (connected_subchannel=0x0 state=CONNECTING); hopping into combiner
I0301 15:21:21.206149000 4590949888 tcp_client_custom.cc:151] CLIENT_CONNECT: 0x6000011ccba0 ipv4:172.128.2.1:443: asynchronously connecting
I0301 15:21:21.206232000 4590949888 client_channel.cc:1040] chand=0x110916e60: processing connectivity change in combiner for subchannel wrapper 0x6000022e7660 subchannel 0x1081c7320 (connected_subchannel=0x0 state=CONNECTING): watcher=0x60000089a520
I0301 15:21:21.206241000 4590949888 subchannel_list.h:249] [pick_first 0x1081c7210] subchannel list 0x104cf5630 index 0 of 1 (subchannel 0x6000022e7660): connectivity changed: state=CONNECTING, shutting_down=0, pending_watcher=0x60000089a520
I0301 15:21:21.206247000 4590949888 client_channel.cc:1273] chand=0x110916e60: update: state=CONNECTING picker=0x60000089a4e0
I0301 15:21:21.206252000 4590949888 connectivity_state.cc:147] SET: 0x110916f30 client_channel: CONNECTING --> CONNECTING [helper]
I0301 15:21:21.206258000 4590949888 client_channel.cc:3847] chand=0x110916e60 calld=0x1108e6a30: LB pick returned QUEUE (subchannel=0x0, error="No Error")
I0301 15:21:21.206329000 4590949888 completion_queue.cc:960] grpc_completion_queue_next(cq=0x104a1c9d0, deadline=gpr_timespec { tv_sec: -9223372036854775808, tv_nsec: 0, clock_type: 0 }, reserved=0x0)
I0301 15:21:21.206389000 4590949888 completion_queue.cc:1060] RETURN_EVENT[0x104a1c9d0]: QUEUE_TIMEOUT
I0301 15:21:21.208089000 4590949888 tcp_custom.cc:353] Creating TCP endpoint 0x6000011cc930
I0301 15:21:21.208100000 4590949888 tcp_client_custom.cc:69] CLIENT_CONNECT: ipv4:172.128.2.1:443: on_alarm: error="Cancelled"
I0301 15:21:21.208347000 4590949888 handshaker.cc:100] handshake_manager 0x1081c81e0: adding handshaker http_connect [0x1091caa00] at index 0
I0301 15:21:21.208803000 4590949888 ssl_transport_security.cc:217] HANDSHAKE START - before SSL initialization - PINIT
I0301 15:21:21.208824000 4590949888 ssl_transport_security.cc:217] LOOP - before SSL initialization - PINIT
I0301 15:21:21.208853000 4590949888 ssl_transport_security.cc:217] LOOP - SSLv3/TLS write client hello - TWCH
I0301 15:21:21.208864000 4590949888 handshaker.cc:100] handshake_manager 0x1081c81e0: adding handshaker security [0x104ad9510] at index 1
I0301 15:21:21.208882000 4590949888 handshaker.cc:131] handshake_manager 0x1081c81e0: error="No Error" shutdown=0 index=0, args={endpoint=0x60000b782700, args=0x600000007940 {size=10: grpc.ssl_target_name_override=test.grpc.internal.com, grpc.primary_user_agent=grpc-node/1.24.11, grpc.client_channel_factory=0x600000044a80, grpc.channel_credentials=0x6000026f8d80, grpc.server_uri=dns:///dev.test.com:443, grpc.default_authority=test.grpc.internal.com, grpc.subchannel_pool=0x600002612a60, grpc.http2_scheme=https, grpc.security_connector=0x6000022e7340, grpc.subchannel_address=ipv4:172.128.2.1:443}, read_buffer=0x104ad98a0 (length=0), exit_early=0}
I0301 15:21:21.208890000 4590949888 handshaker.cc:178] handshake_manager 0x1081c81e0: calling handshaker http_connect [0x1091caa00] at index 0
I0301 15:21:21.208901000 4590949888 handshaker.cc:131] handshake_manager 0x1081c81e0: error="No Error" shutdown=0 index=1, args={endpoint=0x60000b782700, args=0x600000007940 {size=10: grpc.ssl_target_name_override=test.grpc.internal.com, grpc.primary_user_agent=grpc-node/1.24.11, grpc.client_channel_factory=0x600000044a80, grpc.channel_credentials=0x6000026f8d80, grpc.server_uri=dns:///dev.test.com:443, grpc.default_authority=test.grpc.internal.com, grpc.subchannel_pool=0x600002612a60, grpc.http2_scheme=https, grpc.security_connector=0x6000022e7340, grpc.subchannel_address=ipv4:172.128.2.1:443}, read_buffer=0x104ad98a0 (length=0), exit_early=0}
I0301 15:21:21.208936000 4590949888 handshaker.cc:178] handshake_manager 0x1081c81e0: calling handshaker security [0x104ad9510] at index 1
I0301 15:21:21.208950000 4590949888 tcp_custom.cc:235] WRITE 0x6000011cc930 (peer=ipv4:172.128.2.1:443): 16 03 01 00 ce 01 00 00 ca 03 03 a6 8f ca 69 57 70 31 7f 0f 8e 52 ea 76 f8 f0 08 5c 05 89 18 76 5b d3 14 4f 99 31 de 41 8f 52 8a 00 00 0a c0 2b c0 2c c0 2f c0 30 00 ff 01 00 00 97 00 00 00 2d 00 2b 00 00 28 66 78 2d 6c 6f 67 2d 73 65 72 76 65 72 2d 74 65 73 74 2e 67 72 70 63 2e 6a 64 79 2d 69 6e 74 65 72 6e 61 6c 2e 63 6f 6d 00 0b 00 04 03 00 01 02 00 0a 00 04 00 02 00 17 00 23 00 00 33 74 00 00 00 10 00 0e 00 0c 08 67 72 70 63 2d 65 78 70 02 68 32 00 16 00 00 00 17 00 00 00 0d 00 30 00 2e 04 03 05 03 06 03 08 07 08 08 08 09 08 0a 08 0b 08 04 08 05 08 06 04 01 05 01 06 01 03 03 02 03 03 01 02 01 03 02 02 02 04 02 05 02 06 02 '..............iWp1...R.v...\...v[..O.1.A.R.....+.,./.0.........-.+..(test.grpc.internal.com.................#..3t.........grpc-exp.h2...........0................................................'
I0301 15:21:21.209169000 4590949888 tcp_custom.cc:218] write complete on 0x6000011cc930: error="No Error"
The first log shows that the client is connecting to 172.128.1.2, and the last log shows that the client is connecting to 172.128.2.1 (the last two components of the address are different). Also, that last log appears to only get as far as the first one: it never seems to finish connecting. From reading these two logs, I would say that grpc-js is behaving the same as grpc.
Sorry, my bad. That was a transcription error; those are not the real IPs. I replaced the real IPs in the logs, and they refer to the same server.
In the last log, I cut out some lines because there were too many. And I found something interesting. This is the log with grpc-js 1.5.7 and Node 12.16.2, where I got an error in the console:
D 2022-03-02T01:57:08.543Z | call_stream | [1] Sending metadata
D 2022-03-02T01:57:08.545Z | channel | (2) dns:dev.test.com:443 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-03-02T01:57:08.547Z | channel | (2) dns:dev.test.com:443 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-03-02T01:57:08.549Z | call_stream | [1] write() called with message of length 40
D 2022-03-02T01:57:08.552Z | call_stream | [1] end() called
D 2022-03-02T01:57:08.554Z | call_stream | [1] deferring writing data chunk of length 45
D 2022-03-02T01:57:08.556Z | pick_first | Connect to address list dev.test.com:443
D 2022-03-02T01:57:08.559Z | subchannel | (5) dev.test.com:443 Subchannel constructed with options {
"grpc.ssl_target_name_override": "test.grpc.internal.com”,
"grpc.default_authority": "test.grpc.internal.com"
}
D 2022-03-02T01:57:08.561Z | subchannel_refcount | (5) 172.128.1.2:443 refcount 0 -> 1
D 2022-03-02T01:57:08.563Z | subchannel_refcount | (5) 172.128.1.2:443 refcount 1 -> 2
D 2022-03-02T01:57:08.566Z | pick_first | Start connecting to subchannel with address 172.128.1.2:443
D 2022-03-02T01:57:08.568Z | pick_first | IDLE -> CONNECTING
D 2022-03-02T01:57:08.570Z | resolving_load_balancer | dns:dev.test.com:443 IDLE -> CONNECTING
D 2022-03-02T01:57:08.572Z | channel | (2) dns:dev.test.com:443 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-03-02T01:57:08.574Z | channel | (2) dns:dev.test.com:443 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-03-02T01:57:08.577Z | channel | (2) dns:dev.test.com:443 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-03-02T01:57:08.580Z | connectivity_state | (2) dns:dev.test.com:443 IDLE -> CONNECTING
D 2022-03-02T01:57:08.582Z | subchannel | (5) 172.128.1.2:443 IDLE -> CONNECTING
D 2022-03-02T01:57:08.585Z | pick_first | CONNECTING -> CONNECTING
D 2022-03-02T01:57:08.587Z | resolving_load_balancer | dns:dev.test.com:443 CONNECTING -> CONNECTING
D 2022-03-02T01:57:08.589Z | channel | (2) dns:dev.test.com:443 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-03-02T01:57:08.591Z | channel | (2) dns:dev.test.com:443 Pick result: QUEUE subchannel: undefined status: undefined undefined
D 2022-03-02T01:57:08.594Z | channel | (2) dns:dev.test.com:443 callRefTimer.ref | configSelectionQueue.length=0 pickQueue.length=1
D 2022-03-02T01:57:08.599Z | connectivity_state | (2) dns:dev.test.com:443 CONNECTING -> CONNECTING
D 2022-03-02T01:57:08.603Z | subchannel | (5) 172.128.1.2:443 creating HTTP/2 session
D 2022-03-02T01:57:08.621Z | subchannel | (5) 172.128.1.2:443 connection closed with error unable to verify the first certificate
D 2022-03-02T01:57:08.623Z | subchannel | (5) 172.128.1.2:443 connection closed
D 2022-03-02T01:57:08.625Z | subchannel | (5) 172.128.1.2:443 CONNECTING -> TRANSIENT_FAILURE
D 2022-03-02T01:57:08.627Z | pick_first | CONNECTING -> TRANSIENT_FAILURE
D 2022-03-02T01:57:08.629Z | resolving_load_balancer | dns:dev.test.com:443 CONNECTING -> TRANSIENT_FAILURE
D 2022-03-02T01:57:08.632Z | channel | (2) dns:dev.test.com:443 callRefTimer.unref | configSelectionQueue.length=0 pickQueue.length=0
D 2022-03-02T01:57:08.634Z | channel | (2) dns:dev.test.com:443 Pick result: TRANSIENT_FAILURE subchannel: undefined status: 14 No connection established
D 2022-03-02T01:57:08.636Z | call_stream | [1] cancelWithStatus code: 14 details: "No connection established"
D 2022-03-02T01:57:08.638Z | call_stream | [1] ended with status: code=14 details="No connection established"
D 2022-03-02T01:57:08.641Z | connectivity_state | (2) dns:dev.test.com:443 CONNECTING -> TRANSIENT_FAILURE
Connecting to the same address dev.test.com:443 (forwarding gRPC requests using ingress):
grpc 1.24.11 + Node 14.18.3: Success
grpc 1.24.11 + Node 12.16.2: Success
grpc-js 1.5.7 + Node 14.18.3: Error; the client hangs and there is no callback from the server
grpc-js 1.5.7 + Node 12.16.2: Error: 'connection closed with error unable to verify the first certificate'
In the last log, I cut out some lines because there were too many.
So, you included the lines that showed grpc behaving the same as grpc-js, and cut out the lines that showed it behaving differently. I hope you can see how that would be less than useful when trying to demonstrate how grpc works and grpc-js does not.
It looks like that error indicates that the root certificates you provide are not able to verify the certificate provided by the server. I notice that you set the option "grpc.ssl_target_name_override": "test.grpc.internal.com", but the root certificate file provided in the grpc log has the common name *.grpc.jdy-internal.com. That looks like a mismatch.
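For illustration, this is roughly how those two pieces interact in @grpc/grpc-js (a sketch, not your actual code; the CA file path and the bare grpc.Client are assumptions, a generated service client takes the same arguments):

```js
const fs = require('fs');
const grpc = require('@grpc/grpc-js');

// The root certificate passed here must be able to verify the certificate
// chain that the server actually sends.
const creds = grpc.credentials.createSsl(fs.readFileSync('ca.pem'));

const client = new grpc.Client('dev.test.com:443', creds, {
  // The server certificate is checked against this name instead of the
  // connection target, so it must be covered by the certificate's CN/SAN
  // (e.g. a name matching *.grpc.jdy-internal.com in the cert shown above).
  'grpc.ssl_target_name_override': 'test.grpc.internal.com',
  'grpc.default_authority': 'test.grpc.internal.com',
});
```

In other words, whatever name is passed as the override has to be one the server certificate actually covers, and the root CA passed to createSsl() has to be able to verify the server's certificate chain.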
I'm not 100% sure if it's the same thing, but this might be related to #2090.
That's not the same. That bug only occurs if name resolution never succeeds.
You can ignore the IP and domain mentioned in the logs. The domain name in the certificate is the same as the one configured in the code; I replaced the real domain name that appeared in the logs with 'test'. Since no solution was found to this problem, we decided to fall back to grpc 1.24.11.
After several tests, the following results were obtained:
Connecting to the same address dev.test.com:443 (forwarding gRPC requests using ingress):
grpc 1.24.11 + Node 14.18.3: Success
grpc 1.24.11 + Node 12.16.2: Success
grpc-js 1.5.7 + Node 14.18.3: Error; the client hangs and there is no callback from the server
grpc-js 1.5.7 + Node 12.16.2: Error: 'connection closed with error unable to verify the first certificate'
I'm not 100% sure if it's the same thing, but this might be related to #2090.
I've tested version 1.6.4 and it's the same problem, cry....
I guess the behavior differs between the two Node versions because their TLS modules are different.
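One way to narrow that down might be to check whether Node's TLS layer alone can verify the server certificate with the same root CA, outside of gRPC (a sketch; the host, CA path, and server name are assumptions taken from the logs above):

```js
const fs = require('fs');
const tls = require('tls');

const socket = tls.connect({
  host: 'dev.test.com',                  // assumed: the ingress address from the logs
  port: 443,
  ca: fs.readFileSync('ca.pem'),         // assumed: the same root CA passed to gRPC
  servername: 'test.grpc.internal.com',  // SNI name, analogous to grpc.ssl_target_name_override
  ALPNProtocols: ['h2'],
  rejectUnauthorized: false,             // connect even if verification fails, so the result can be inspected
}, () => {
  console.log('authorized:', socket.authorized, socket.authorizationError);
  socket.end();
});

socket.on('error', (err) => console.error('TLS error:', err.message));
```

If this reports the same 'unable to verify the first certificate' error on one Node version but not the other, the difference really is in the Node TLS layer rather than in grpc-js itself.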
grpc-js 1.5.7 + Node 14.18.3: Error; the client hangs and there is no callback from the server
grpc-js 1.5.7 + Node 12.16.2: Error: 'connection closed with error unable to verify the first certificate'
Problem description
The client hangs with no callback from the server, but when using grpc 1.24.11 there is no problem, and if we don't use SSL there is no problem either.
Environment
Additional context
Log 1, with SSL (the IP and domain are not real):
Then the client hangs.
Log 2, without SSL: everything is OK!