Closed cfis closed 1 year ago
What happened?
This is similar to https://github.com/jaegertracing/jaeger/issues/3852, but for the Jaeger Operator.
Installing Jaeger via the operator produces the error:
transport: Error while dialing: dial tcp :16685: connect: connection refused
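
The full logs are pasted below; a quick way to pull out just the lines mentioning port 16685 (the pod name is from my cluster and will differ on yours):
kubectl logs -n observability jaeger-operator-jaeger-7d75c59dbc-249c5 | grep 16685
# shows both the "connection refused" dial attempts and the query gRPC server starting on :16685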

Steps to reproduce
Install the Jaeger Operator:
helm install jaeger-operator jaegertracing/jaeger-operator -n observability -f jaeger.yaml
Where jaeger.yaml is:
jaeger:
  create: true
  namespace: observability
  spec:
    strategy: allInOne
    ingress:
      enabled: true
      secretName: <redacted>
      hosts:
        - <redacted>
This creates a jaeger-operator pod and a jaeger pod (jaeger-operator-jaeger-7d75c59dbc-249c5).
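
As a quick sanity check (assuming the same namespace and release name as above), the pods can be listed with:
kubectl get pods -n observability
# expect a jaeger-operator pod and the all-in-one pod, e.g. jaeger-operator-jaeger-7d75c59dbc-249c5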

Expected behavior
Port 16685 should work.
Note that port 16685 is not exposed by the container. It looks like the container ports are set here:
https://github.com/jaegertracing/jaeger-operator/blob/723105ff90f5dbe2d49258adfbcce9a217f7399c/pkg/deployment/all_in_one.go#L156
I'm not sure whether that fixes the issue or not, though.
Also note that I am testing this on minikube, although that should hopefully make no difference.
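
One way to confirm which ports the all-in-one container actually exposes (a sketch; the deployment name jaeger-operator-jaeger is inferred from the pod name above):
kubectl get deployment jaeger-operator-jaeger -n observability \
  -o jsonpath='{.spec.template.spec.containers[0].ports[*].containerPort}'
# 16685 does not appear in this list, which matches the note above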

Relevant log output
Logs from jaeger-operator-jaeger-7d75c59dbc-249c5:
2023/06/15 03:03:42 maxprocs: Leaving GOMAXPROCS=8: CPU quota undefined {"level":"info","ts":1686798222.5290956,"caller":"flags/service.go:119","msg":"Mounting metrics handler on admin server","route":"/metrics"} {"level":"info","ts":1686798222.529127,"caller":"flags/service.go:125","msg":"Mounting expvar handler on admin server","route":"/debug/vars"} {"level":"info","ts":1686798222.5292172,"caller":"flags/admin.go:129","msg":"Mounting health check on admin server","route":"/"} {"level":"info","ts":1686798222.5292404,"caller":"flags/admin.go:143","msg":"Starting admin HTTP server","http-addr":":14269"} {"level":"info","ts":1686798222.5292473,"caller":"flags/admin.go:121","msg":"Admin server started","http.host-port":"[::]:14269","health-status":"unavailable"} {"level":"info","ts":1686798222.530166,"caller":"memory/factory.go:66","msg":"Memory storage initialized","configuration":{"MaxTraces":0}} {"level":"info","ts":1686798222.5304503,"caller":"static/strategy_store.go:138","msg":"Loading sampling strategies","filename":"/etc/jaeger/sampling/sampling.json"} {"level":"info","ts":1686798222.5437305,"caller":"grpc@v1.54.0/server.go:632","msg":"[core][Server #1] Server created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5438118,"caller":"server/grpc.go:104","msg":"Starting jaeger-collector gRPC server","grpc.host-port":"[::]:14250"} {"level":"info","ts":1686798222.543826,"caller":"server/http.go:56","msg":"Starting jaeger-collector HTTP server","http host-port":":14268"} {"level":"info","ts":1686798222.5438533,"caller":"grpc@v1.54.0/server.go:820","msg":"[core][Server #1 ListenSocket #2] ListenSocket created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5439267,"caller":"server/zipkin.go:57","msg":"Listening for Zipkin HTTP traffic","zipkin host-port":":9411"} {"level":"warn","ts":1686798222.5533488,"caller":"internal/warning.go:51","msg":"Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks","documentation":"https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"} {"level":"info","ts":1686798222.5533893,"caller":"grpc@v1.54.0/server.go:632","msg":"[core][Server #3] Server created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5533977,"caller":"otlpreceiver@v0.76.1/otlp.go:94","msg":"Starting GRPC server","endpoint":"0.0.0.0:4317"} {"level":"warn","ts":1686798222.5534341,"caller":"internal/warning.go:51","msg":"Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks","documentation":"https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"} {"level":"info","ts":1686798222.5534477,"caller":"otlpreceiver@v0.76.1/otlp.go:112","msg":"Starting HTTP server","endpoint":"0.0.0.0:4318"} {"level":"info","ts":1686798222.5534656,"caller":"grpc@v1.54.0/server.go:820","msg":"[core][Server #3 ListenSocket #4] ListenSocket created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5534742,"caller":"grpc/builder.go:73","msg":"Agent requested insecure grpc connection to collector(s)"} {"level":"info","ts":1686798222.553493,"caller":"grpc@v1.54.0/clientconn.go:105","msg":"[core][Channel #5] Channel created","system":"grpc","grpc_log":true} 
{"level":"info","ts":1686798222.5535097,"caller":"grpc@v1.54.0/clientconn.go:1569","msg":"[core][Channel #5] original dial target is: \":14250\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5535252,"caller":"grpc@v1.54.0/clientconn.go:1574","msg":"[core][Channel #5] dial target \":14250\" parse failed: parse \":14250\": missing protocol scheme","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5535316,"caller":"grpc@v1.54.0/clientconn.go:1589","msg":"[core][Channel #5] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5535557,"caller":"grpc@v1.54.0/clientconn.go:1597","msg":"[core][Channel #5] parsed dial target is: {Scheme:passthrough Authority: URL:{Scheme:passthrough Opaque: User: Host: Path:/:14250 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5535626,"caller":"grpc@v1.54.0/clientconn.go:273","msg":"[core][Channel #5] Channel authority set to \"166:14250\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5536456,"caller":"grpc@v1.54.0/resolver_conn_wrapper.go:175","msg":"[core][Channel #5] Resolver state updated: {\n \"Addresses\": [\n {\n \"Addr\": \":14250\",\n \"ServerName\": \"\",\n \"Attributes\": null,\n \"BalancerAttributes\": null,\n \"Type\": 0,\n \"Metadata\": null\n }\n ],\n \"ServiceConfig\": null,\n \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.553675,"caller":"grpc@v1.54.0/balancer_conn_wrappers.go:274","msg":"[core][Channel #5] Channel switches to new LB policy \"round_robin\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.553698,"caller":"grpc@v1.54.0/balancer_conn_wrappers.go:306","msg":"[core][Channel #5 SubChannel #6] Subchannel created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5537117,"caller":"base/balancer.go:177","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[]}","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5537233,"caller":"grpc@v1.54.0/clientconn.go:428","msg":"[core][Channel #5] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5537734,"caller":"grpc/builder.go:113","msg":"Checking connection to collector"} {"level":"info","ts":1686798222.553777,"caller":"grpc@v1.54.0/clientconn.go:1117","msg":"[core][Channel #5 SubChannel #6] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.553781,"caller":"grpc/builder.go:124","msg":"Agent collector connection state change","dialTarget":":14250","status":"CONNECTING"} {"level":"info","ts":1686798222.553798,"caller":"grpc@v1.54.0/clientconn.go:1231","msg":"[core][Channel #5 SubChannel #6] Subchannel picks a new address \":14250\" to connect","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5543125,"caller":"grpc@v1.54.0/clientconn.go:1117","msg":"[core][Channel #5 SubChannel #6] Subchannel Connectivity change to READY","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5544164,"caller":"base/balancer.go:177","msg":"[roundrobin]roundrobinPicker: Build called with info: {map[0xc00028fbf0:{{\n \"Addr\": \":14250\",\n \"ServerName\": \"\",\n \"Attributes\": null,\n \"BalancerAttributes\": null,\n \"Type\": 0,\n \"Metadata\": null\n}}]}","system":"grpc","grpc_log":true} 
{"level":"info","ts":1686798222.5544362,"caller":"grpc@v1.54.0/clientconn.go:428","msg":"[core][Channel #5] Channel Connectivity change to READY","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5544813,"caller":"./main.go:256","msg":"Starting agent"} {"level":"info","ts":1686798222.5545537,"caller":"grpc/builder.go:124","msg":"Agent collector connection state change","dialTarget":":14250","status":"READY"} {"level":"info","ts":1686798222.5545766,"caller":"querysvc/query_service.go:134","msg":"Archive storage not created","reason":"archive storage not supported"} {"level":"info","ts":1686798222.5545824,"caller":"app/agent.go:69","msg":"Starting jaeger-agent HTTP server","http-port":5778} {"level":"info","ts":1686798222.5545893,"caller":"app/flags.go:141","msg":"Archive storage not initialized"} {"level":"info","ts":1686798222.5546787,"caller":"grpc@v1.54.0/server.go:632","msg":"[core][Server #9] Server created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554708,"caller":"grpc@v1.54.0/clientconn.go:105","msg":"[core][Channel #10] Channel created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547187,"caller":"grpc@v1.54.0/clientconn.go:1569","msg":"[core][Channel #10] original dial target is: \":16685\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547292,"caller":"grpc@v1.54.0/clientconn.go:1574","msg":"[core][Channel #10] dial target \":16685\" parse failed: parse \":16685\": missing protocol scheme","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547342,"caller":"grpc@v1.54.0/clientconn.go:1589","msg":"[core][Channel #10] fallback to scheme \"passthrough\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547466,"caller":"grpc@v1.54.0/clientconn.go:1597","msg":"[core][Channel #10] parsed dial target is: {Scheme:passthrough Authority: URL:{Scheme:passthrough Opaque: User: Host: Path:/:16685 RawPath: OmitHost:false ForceQuery:false RawQuery: Fragment: RawFragment:}}","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547526,"caller":"grpc@v1.54.0/clientconn.go:273","msg":"[core][Channel #10] Channel authority set to \"localhost:16685\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547757,"caller":"grpc@v1.54.0/resolver_conn_wrapper.go:175","msg":"[core][Channel #10] Resolver state updated: {\n \"Addresses\": [\n {\n \"Addr\": \":16685\",\n \"ServerName\": \"\",\n \"Attributes\": null,\n \"BalancerAttributes\": null,\n \"Type\": 0,\n \"Metadata\": null\n }\n ],\n \"ServiceConfig\": null,\n \"Attributes\": null\n} (resolver returned new addresses)","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5547972,"caller":"grpc@v1.54.0/balancer_conn_wrappers.go:274","msg":"[core][Channel #10] Channel switches to new LB policy \"pick_first\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554808,"caller":"grpc@v1.54.0/balancer_conn_wrappers.go:306","msg":"[core][Channel #10 SubChannel #11] Subchannel created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554821,"caller":"grpc@v1.54.0/clientconn.go:428","msg":"[core][Channel #10] Channel Connectivity change to CONNECTING","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554839,"caller":"grpc@v1.54.0/clientconn.go:1117","msg":"[core][Channel #10 SubChannel #11] Subchannel Connectivity change to CONNECTING","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554851,"caller":"grpc@v1.54.0/clientconn.go:1231","msg":"[core][Channel 
#10 SubChannel #11] Subchannel picks a new address \":16685\" to connect","system":"grpc","grpc_log":true} {"level":"warn","ts":1686798222.5549424,"caller":"grpc@v1.54.0/clientconn.go:1292","msg":"[core][Channel #10 SubChannel #11] grpc: addrConn.createTransport failed to connect to {\n \"Addr\": \":16685\",\n \"ServerName\": \"localhost:16685\",\n \"Attributes\": null,\n \"BalancerAttributes\": null,\n \"Type\": 0,\n \"Metadata\": null\n}. Err: connection error: desc = \"transport: Error while dialing: dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5549562,"caller":"grpc@v1.54.0/clientconn.go:1119","msg":"[core][Channel #10 SubChannel #11] Subchannel Connectivity change to TRANSIENT_FAILURE, last error: connection error: desc = \"transport: Error while dialing: dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.554966,"caller":"grpc@v1.54.0/clientconn.go:428","msg":"[core][Channel #10] Channel Connectivity change to TRANSIENT_FAILURE","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5550945,"caller":"app/server.go:215","msg":"Query server started","http_addr":"[::]:16686","grpc_addr":"[::]:16685"} {"level":"info","ts":1686798222.5551047,"caller":"healthcheck/handler.go:129","msg":"Health Check state change","status":"ready"} {"level":"info","ts":1686798222.5551147,"caller":"app/server.go:298","msg":"Starting GRPC server","port":16685,"addr":":16685"} {"level":"info","ts":1686798222.5551238,"caller":"grpc@v1.54.0/server.go:820","msg":"[core][Server #9 ListenSocket #12] ListenSocket created","system":"grpc","grpc_log":true} {"level":"info","ts":1686798222.5551424,"caller":"app/server.go:279","msg":"Starting HTTP server","port":16686,"addr":":16686"} {"level":"info","ts":1686798223.5551968,"caller":"grpc@v1.54.0/clientconn.go:1119","msg":"[core][Channel #10 SubChannel #11] Subchannel Connectivity change to IDLE, last error: connection error: desc = \"transport: Error while dialing: dial tcp :16685: connect: connection refused\"","system":"grpc","grpc_log":true} {"level":"info","ts":1686798223.5552473,"caller":"grpc@v1.54.0/clientconn.go:428","msg":"[core][Channel #10] Channel Connectivity change to IDLE","system":"grpc","grpc_log":true}

Screenshot
None

Additional context
Installed on minikube

Jaeger backend version
1.45.0

SDK
Jaeger operator helm chart

Pipeline
No response

Storage backend
Memory

Operating system
Fedora 38

Deployment model
Helm

Deployment configs
See above

Duplicated from #2041.