Closed — prana24 closed this issue 2 years ago
We have Jaeger 1.14 deployed on an Azure Kubernetes cluster. The collector service is defined as a LoadBalancer with the annotation external-dns.alpha.kubernetes.io/hostname, and our agent connects to the collector using the service DNS name.
I see warning logs in the collector log and am worried that something is wrong, although I do see traces and spans arriving in the Jaeger backend, i.e. Elasticsearch.
Below is the log message I keep observing.
WARNING: 2020/04/07 18:40:43 grpc: Server.Serve failed to create ServerTransport: connection error: desc = "transport: http2Server.HandleStreams failed to receive the preface from client: read tcp 192.168.130.18:14250->10.128.36.40:50417: read: connection reset by peer"
Any idea/suggestion will be highly appreciated.
@prana24 could you please update to Jaeger 1.17.1?
@pavolloffay I'm getting this kind of warning even on 1.17.1
^^
I have noticed this in version 1.18 with Kafka as the storage backend.
{"level":"warn","ts":1591511301.9598265,"caller":"grpc@v1.27.1/server.go:669","msg":"grpc: Server.Serve failed to create ServerTransport: connection error: desc = \"transport: http2Server.HandleStreams failed to receive the preface from client: EOF\"","system":"grpc","grpc_log":true}
Are you seeing this message periodically? Are you seeing traces "without a root span", possibly created around the time such a message occurs?
Without extra context, I would not worry about those messages, as they probably just indicate that a networking failure happened between the agent and the collector. IIRC, gRPC will just retry, so no data should have been lost.
I forgot to update this issue, but I could pinpoint it to the cloud provider's health checks on the k8s nodes. As this relates to gRPC, I've made an issue here: https://github.com/grpc/grpc-go/issues/4234 - when this gets resolved, the package can be updated in Jaeger.
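For anyone who wants to confirm that health checks are the cause, here is a minimal Go sketch (the collector address is a made-up placeholder, not a real host) that mimics what a TCP-level load-balancer health check does: it opens a connection to the collector's gRPC port and closes it without ever sending the HTTP/2 client preface, which is exactly the situation that makes the server log the "failed to receive the preface from client: EOF" / "connection reset by peer" warning.

```go
package main

import (
	"log"
	"net"
	"time"
)

func main() {
	// Hypothetical collector address; adjust to your own deployment.
	const collectorGRPC = "jaeger-collector.example.internal:14250"

	// Open a raw TCP connection, the way a TCP health check would...
	conn, err := net.DialTimeout("tcp", collectorGRPC, 2*time.Second)
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}

	// ...and close it immediately, without sending the HTTP/2 client preface.
	// The gRPC server on the other end should then log
	// "http2Server.HandleStreams failed to receive the preface from client: EOF".
	conn.Close()
	log.Println("connection opened and closed without a gRPC handshake")
}
```

If the warnings line up with your health-check interval, that is a strong hint the probes, not the agents, are the source.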
This is now fixed in grpc-go; it would be nice to upgrade this package when there is a new release.
Unfortunately, I think updating grpc-go is a bit problematic right now. @joe-elliott and @pavolloffay have tried in the past, with no luck.
Ai, that does not boost my confidence xD - do you perhaps have any related issues/PRs so I can have a look at what was blocking? At the moment this issue is spamming our logs insanely, so I have motivation to get it fixed (:
I think the problem is related to gogo, which doesn't seem to work with any recent version of gRPC. And moving away from gogo would cause performance regressions. We have to eventually find a solution to this, as we are one CVE away from being forced to make a decision in a rush...
Perhaps @pavolloffay and @joe-elliott can add comments based on their recollection of the matter?
Maybe we should look into the OTel Collector's pdata, which IIRC avoids the default proto performance issues.
Here is my attempt at upgrading:
https://github.com/jaegertracing/jaeger/pull/2857
The last few comments show the final hurdles I could not get past.
Perhaps this could be useful: https://vitess.io/blog/2021-06-03-a-new-protobuf-generator-for-go/
^ this is a great find, I think we should try it out. At minimum, it will unblock us to upgrade proto & grpc.
Could this be re-opened? Even though we are upgrading to 1.38, the fix for this issue itself is not yet included in 1.38, as it was made only 6 days ago :) It would require an update to 1.39, or whatever version includes grpc/grpc-go#4234.
Thanks a lot for this upgrade move though. Awesome <3
It may be related to https://github.com/grpc/grpc-go/issues/875
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
This issue has been automatically closed due to inactivity.
Hey peeps! This issue is still happening. I've hit the same issue in the same scenario reported by @prana24:
{"level":"info","ts":1685345493.0991235,"caller":"grpc@v1.54.0/server.go:935","msg":"[core][Server #1] grpc: Server.Serve failed to create ServerTransport: connection error: desc = \"transport: http2Server.HandleStreams received bogus greeting from client: \\"POST /api/v2/spans HTTP/\\"\"","system":"grpc","grpc_log":true}
{"level":"info","ts":1709008560.3461268,"caller":"grpc@v1.61.0/server.go:994","msg":"[core][Server #5] grpc: Server.Serve failed to create ServerTransport: connection error: desc = \"transport: http2Server.HandleStreams received bogus greeting from client: \\"POST /v1/traces HTTP/1.1\\"\"","system":"grpc","grpc_log":true}
That means something is sending plain HTTP to the gRPC port.
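To make the "bogus greeting" variant concrete, here is a minimal Go sketch (the address and payload are placeholders, not real endpoints): it sends a plain HTTP/1.1 POST to the collector's gRPC port, so the first bytes the gRPC server reads are literally `POST /v1/traces HTTP/1.1` instead of the HTTP/2 client preface, which is what ends up quoted in the warning. The fix is to point HTTP senders at the collector's HTTP endpoints rather than the gRPC port.

```go
package main

import (
	"bytes"
	"log"
	"net/http"
)

func main() {
	// Hypothetical address: this is the gRPC port, not an HTTP ingest endpoint.
	const wrongURL = "http://jaeger-collector.example.internal:14250/v1/traces"

	// Sending a plain HTTP/1.1 request here makes the gRPC server log
	// `received bogus greeting from client: "POST /v1/traces HTTP/1.1"`,
	// because it expected the HTTP/2 client preface instead.
	resp, err := http.Post(wrongURL, "application/x-protobuf", bytes.NewReader([]byte("placeholder")))
	if err != nil {
		log.Printf("request failed as expected: %v", err)
		return
	}
	defer resp.Body.Close()
	log.Printf("unexpected response: %s", resp.Status)
}
```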