NhatVuDuy opened 1 week ago
Hi @NhatVuDuy, it appears that the issue is with otel-go-auto parsing the Go modules to analyze offsets. It discovered a module with version (devel), hence the "Malformed version: (devel)" error. It's something we should fix. Do you mind sharing what your go.mod looks like, if possible?
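For reference, here's a minimal sketch of the failure I suspect, assuming the analyzer parses versions with hashicorp/go-version (whose error text matches the one in your log): a binary built from a local or vendored checkout reports `(devel)` instead of a semver tag, and that string fails to parse:

```go
package main

import (
	"fmt"

	version "github.com/hashicorp/go-version"
)

func main() {
	// A tagged release parses fine; the "(devel)" placeholder that Go
	// embeds for locally built modules does not.
	for _, v := range []string{"v0.4.47", "(devel)"} {
		if _, err := version.NewVersion(v); err != nil {
			fmt.Printf("%q -> %v\n", v, err) // "(devel)" -> Malformed version: (devel)
		} else {
			fmt.Printf("%q -> ok\n", v)
		}
	}
}
```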
It looks like the log output was truncated. Can you post the full message?
Hi @MrAlias, this is the full issue log from the otel k8s pod. Do you mean the full log of the otel startup process in the k8s pod?
Thank you!
> Hi @NhatVuDuy, it appears that the issue is with otel-go-auto parsing the Go modules to analyze offsets. It discovered a module with version (devel), hence the "Malformed version: (devel)" error. It's something we should fix. Do you mind sharing what your go.mod looks like, if possible?
Thank you for your reply. I traced the log and was able to find the place in the otel auto-instrumentation source code where the issue occurs: it's the probe-loading step, with the version issue you mentioned. But I don't know why the issue happens only with the kafka service.
As for go.mod, sorry, I can't show the full content because it's not my personal project. But I can share the relevant info, such as the versions of the packages you need. Please tell me which ones you need.
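If it helps, I think I can also run `go version -m <path-to-our-binary>` against the built service binary to dump exactly the module versions embedded in it (including whatever it reports for kafka-go), without sharing go.mod itself.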
Thank you so much!
@NhatVuDuy are you sure that is the full output of the auto-instrumentation container log? It looks like it got cut off at the end here:
"0.8.1","github.com/prometheus/client_golan
(Also if you paste the log in 3 backticks like ``` it can be easier to read)
@damemi Thank you, you're right, the log is cut off at the end. Have you just edited it? The log looks different from my record. Should I re-run and get the full log of the otel container again, then update my issue record?
@NhatVuDuy I copied and pasted exactly what you shared. Please re-run to get the full log of the otel container and share it here. Thanks!
@damemi Here it is. Hope this is correct. Thank you!
{"level":"info","ts":1725501606.8059921,"logger":"go.opentelemetry.io/auto","caller":"cli/main.go:86","msg":"building OpenTelemetry Go instrumentation ...","globalImpl":false}
{"level":"info","ts":1725501608.806907,"logger":"Instrumentation.Analyzer","caller":"process/discover.go:67","msg":"found process","pid":7}
{"level":"error","ts":1725501608.8076844,"logger":"Instrumentation.Analyzer","caller":"process/analyze.go:83","msg":"error parsing module version","error":"Malformed version: (devel)","stacktrace":"go.opentelemetry.io/auto/internal/pkg/process.(*Analyzer).Analyze\n\t/app/internal/pkg/process/analyze.go:83\ngo.opentelemetry.io/auto.NewInstrumentation\n\t/app/instrumentation.go:133\nmain.main\n\t/app/cli/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:271"}
{"level":"info","ts":1725501608.8363714,"logger":"Instrumentation.Allocate","caller":"process/allocate.go:73","msg":"Detaching from process","pid":7}
{"level":"info","ts":1725501608.84071,"logger":"Instrumentation","caller":"app/instrumentation.go:144","msg":"target process analysis completed","pid":7,"go_version":"1.20.2","dependencies":{"github.com/beorn7/perks":"1.0.1","github.com/cespare/xxhash/v2":"2.1.1","github.com/eclipse/paho.mqtt.golang":"1.2.0","github.com/fsnotify/fsnotify":"1.4.7","github.com/go-kit/kit":"0.10.0","github.com/go-logfmt/logfmt":"0.5.0","github.com/go-sql-driver/mysql":"1.4.1","github.com/go-stack/stack":"1.8.0","github.com/go-zoo/bone":"1.3.0","github.com/gocql/gocql":"0.0.0-20200511135441-57b003a04490","github.com/gofrs/uuid":"3.3.0+incompatible","github.com/golang/protobuf":"1.4.2","github.com/golang/snappy":"0.0.1","github.com/google/uuid":"1.1.1","github.com/hailocab/go-hostpool":"0.0.0-20160125115350-e80d13ce29ed","github.com/jmoiron/sqlx":"1.2.1-0.20190319043955-cdf62fdf55f6","github.com/klauspost/compress":"1.9.5","github.com/lib/pq":"1.8.0","github.com/matttproud/golang_protobuf_extensions":"1.0.1","github.com/opentracing/opentracing-go":"1.1.0","github.com/pkg/errors":"0.8.1","github.com/prometheus/client_golang":"1.6.0","github.com/prometheus/client_model":"0.2.0","github.com/prometheus/common":"0.9.1","github.com/prometheus/procfs":"0.0.11","github.com/rubenv/sql-migrate":"0.0.0-20200616145509-8d140a17f351","github.com/subosito/gotenv":"1.2.0","github.com/uber/jaeger-lib":"2.3.0+incompatible","github.com/xdg/scram":"0.0.0-20180814205039-7eeb5667e42c","github.com/xdg/stringprep":"1.0.1-0.20180714160509-73f8eece6fdc","go.mongodb.org/mongo-driver":"1.3.3","go.uber.org/atomic":"1.5.0","golang.org/x/crypto":"0.0.0-20200622213623-75b288015ac9","golang.org/x/net":"0.0.0-20201224014010-6772e930b67b","golang.org/x/sync":"0.0.0-20201020160332-67f06af15bc9","golang.org/x/sys":"0.0.0-20201119102817-f84b799fce68","golang.org/x/text":"0.3.3","google.golang.org/genproto":"0.0.0-20200513103714-09dca8ec2884","google.golang.org/grpc":"1.29.1","google.golang.org/protobuf":"1.23.0","gopkg.in/gorp.v1":"1.7.2","gopkg.in/inf.v0":"0.9.1","gopkg.in/ini.v1":"1.51.0","gopkg.in/mail.v2":"2.3.1","std":"1.20.2"},"total_functions_found":11}
{"level":"info","ts":1725501608.8410597,"logger":"go.opentelemetry.io/auto","caller":"cli/main.go:119","msg":"starting instrumentation..."}
{"level":"info","ts":1725501608.8414338,"logger":"Instrumentation.Manager","caller":"instrumentation/manager.go:197","msg":"loading probe","name":"net/http/client"}
{"level":"info","ts":1725501609.0524049,"logger":"Instrumentation.Manager","caller":"instrumentation/manager.go:197","msg":"loading probe","name":"database/sql/client"}
{"level":"info","ts":1725501609.1308398,"logger":"Instrumentation.Manager","caller":"instrumentation/manager.go:197","msg":"loading probe","name":"github.com/segmentio/kafka-go/consumer"}
{"level":"error","ts":1725501609.1316051,"logger":"Instrumentation.Manager","caller":"instrumentation/manager.go:200","msg":"error while loading probes, cleaning up","name":"github.com/segmentio/kafka-go/consumer","error":"unknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go","stacktrace":"go.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).load\n\t/app/internal/pkg/instrumentation/manager.go:200\ngo.opentelemetry.io/auto/internal/pkg/instrumentation.(*Manager).Run\n\t/app/internal/pkg/instrumentation/manager.go:143\ngo.opentelemetry.io/auto.(*Instrumentation).Run\n\t/app/instrumentation.go:162\nmain.main\n\t/app/cli/main.go:120\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:271"}
{"level":"error","ts":1725501609.970436,"logger":"go.opentelemetry.io/auto","caller":"cli/main.go:121","msg":"instrumentation crashed","error":"unknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go\nunknown module version: github.com/segmentio/kafka-go","stacktrace":"main.main\n\t/app/cli/main.go:121\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:271"}
It looks like your version of kafka-go is unknown (note that kafka-go is missing from the dependency list in your log, presumably because its version failed to parse). Do you happen to be using a local/development version of kafka-go?
I used kafka from the standard dependency: https://github.com/segmentio/kafka-go (go vendor mode).
What happens when you run this with the upstream module directly? I'm guessing that's where the package is reporting a (devel) version from.
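If there's a `replace` directive pointing kafka-go at a local checkout, that would explain the `(devel)` version. As a guess (v0.4.47 here is just an example tag, use whichever release you're actually on), re-pinning the upstream module with `go get github.com/segmentio/kafka-go@v0.4.47` and re-running `go mod vendor` might restore a real tagged version.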
Let me try to check this. The system is isolated from direct internet access, so I have to use vendor mode and transfer dependencies via a file transfer system.
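One alternative I may try, in case vendoring itself is what loses the version info: on a machine with internet access, run `go mod download`, transfer the resulting `$GOMODCACHE/cache/download` directory across, and build with `GOPROXY=file:///path/to/cache/download` (the path is a placeholder for wherever I place it). That way the build can resolve real tagged versions without direct internet access.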
Describe the bug
I get a crash when using the otel auto-instrumentation Docker container when the main container runs a kafka service.
Environment
To Reproduce
Steps to reproduce the behavior: run the otel auto-instrumentation Docker container alongside the main Docker container. The error occurs if the main container has a kafka service.
Expected behavior
The otel auto-instrumentation container should instrument the main container's kafka service without crashing.
Additional context
The full log is posted in the comments above.
Please give me any suggestions. Thank you so much.