Closed yaohongkok closed 10 months ago
I'm seeing the same issue, and it seems to be related to this change: https://github.com/open-telemetry/opentelemetry-js/pull/4062/
The `startTimeUnixNano` and `endTimeUnixNano` fields are serialized as an object, e.g. `{"low":1045401984,"high":395145071}`, instead of a number or a string, so the collector cannot parse them and throws the `ReadUint64: unsupported value type` error: https://github.com/open-telemetry/opentelemetry-collector/blob/main/pdata/internal/json/number.go#L71-L86

Based on the parser code, serializing the 64-bit value to a string (instead of an object with `low` and `high` fields) would solve this issue.
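For illustration, the `{ low, high }` pair can be recombined into the decimal string the collector's parser accepts. This is a sketch, not the library's actual code; the helper name is made up:

```typescript
// Sketch: recombine a protobufjs Long-style { low, high } pair
// (two signed 32-bit halves) into an unsigned 64-bit decimal string.
function longToString(low: number, high: number): string {
  // ">>> 0" reinterprets each signed 32-bit half as unsigned
  const value = (BigInt(high >>> 0) << 32n) | BigInt(low >>> 0);
  return value.toString();
}

// e.g. the pair from the failing payload:
console.log(longToString(1045401984, 395145071));
```

Emitting a string like this would satisfy the collector's `ReadUint64`, which handles numeric and string JSON values but not objects.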
@lingyan Is there an older release of this open-telemetry-js that I could run without running into this issue?
If you can, try downgrading `@opentelemetry/exporter-trace-otlp-http` to `0.43.0` for now.
The JSON encoding went unnoticed in my PR, where the intermediate types are serialized directly with `JSON.stringify` in the HTTP exporter. The conversion code needs another flag (e.g. to serialize `fixed64` to strings). I'll try to get a fix in tomorrow.
One thing to consider is whether CI should run some sort of functional test where the output from Node.js exporters is sent to an OpenTelemetry Collector, ensuring the data is parsed and collected correctly.
Such a test would have failed before the original PR #4062 was merged.
Thank you for picking this up @seemk
Agreed, we should have something like this. There's #2157 to track this. I added it to the Exporter GA Milestone.
Adding `toString` from long.js would pretty much bring in most of the library. I wonder if it'd make sense to copy long.js into this codebase instead of copying only the relevant parts.

Another option would be to encode `fixed64` as numbers in the JSON exporter, but this means the precision error is back.

I'm for just copying over long.js (as far as I know it can't be added as a dependency due to ESM), but this would cause the package size to go up, and I think the JSON exporter is mostly used in browsers :thinking:
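To make the precision concern concrete: a nanosecond-resolution Unix timestamp is around 1.7e18, beyond what an IEEE-754 double can represent exactly:

```typescript
// JavaScript numbers are doubles: integers are exact only up to
// Number.MAX_SAFE_INTEGER (2^53 - 1, about 9.0e15). A current Unix
// timestamp in nanoseconds (~1.7e18) is far past that.
const exact = 1697135158166000001n;      // odd, so not exactly representable
const rounded = Number(exact);           // rounds to the nearest double
console.log(BigInt(rounded) === exact);  // false: low-order digits lost
```

At this magnitude adjacent doubles are 256 apart, so the round trip can be off by a couple of hundred nanoseconds.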
Is using JavaScript's built-in BigInt type an option?
Went with the BigInt route, with a fallback to the old imprecise numeric value (precision doesn't really matter for browsers anyway) in case BigInt isn't available: https://github.com/open-telemetry/opentelemetry-js/pull/4220
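A rough sketch of that approach; the function name and signature are illustrative, not the exact code in the PR:

```typescript
// Sketch: encode a 64-bit unix-nano timestamp as a decimal string via
// BigInt, falling back to a possibly-imprecise plain number in
// environments (older browsers) where BigInt is unavailable.
function encodeUnixNano(seconds: number, nanos: number): string | number {
  if (typeof BigInt !== 'undefined') {
    return (BigInt(seconds) * 1_000_000_000n + BigInt(nanos)).toString();
  }
  return seconds * 1e9 + nanos; // may drift by a few hundred ns
}

console.log(encodeUnixNano(1697135158, 166000001)); // "1697135158166000001"
```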
This was quite the debug session to find the issue; the errors coming from the tools we used weren't helpful.

Given this appears to be a broken release (I assume), could a patch be released, or `latest` on npm retagged to 0.43.0? Personally, it was my first time using OTel, and without a changelog or anything to go off, this was very hard to pin down.
+1 for releasing emergency patches in these cases
https://github.com/open-telemetry/opentelemetry-js/issues/4202#issuecomment-1760315879
This didn't work ... any other alternative fix available?
Downgrade everything that's 0.44.0 to 0.43.0, this helped me
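If pinning direct dependencies isn't enough (a transitive dependency may still pull in 0.44.0), npm's `overrides` field can force everything down. A sketch for `package.json`, assuming npm 8.3+; the exact package list depends on what your project uses:

```json
{
  "overrides": {
    "@opentelemetry/exporter-trace-otlp-http": "0.43.0",
    "@opentelemetry/instrumentation": "0.43.0",
    "@opentelemetry/otlp-transformer": "0.43.0"
  }
}
```

Run `npm install` again afterwards so the lockfile picks up the overrides.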
Hi all, any traction on this issue? Since it is a major break in functionality, if we believe a fix might require more time, I recommend reverting the change that introduced it and re-releasing ASAP. As it is now, the 0.44 version can't be used, and there is no indication of that fact in any documentation.
The fix has been in a PR state for a while: https://github.com/open-telemetry/opentelemetry-js/pull/4220
Thanks @seemk. What is preventing a merge? I see some unit tests failing. Is that OK?
Looks like all PRs have the same issue currently: https://github.com/open-telemetry/opentelemetry-js/pulls :thinking:
@pichlermarc seems to be out of office. I wonder who else could approve this and generate a new release.
> This didn't work ... any other alternative fix available?
>
> Downgrade everything that's 0.44.0 to 0.43.0, this helped me

Thanks @mateusz-lisik, but that didn't work. I downgraded these to 0.43.0: `@opentelemetry/exporter-trace-otlp-http`, `@opentelemetry/instrumentation`, but I'm still facing the same issue.
Any updates on this? It is blocking a project I'm working on.
Welcome to the club :)
> Downgrade everything that's 0.44.0 to 0.43.0, this helped me
0.43 does not have this issue but it has another bug (fixed in 0.44) where the library cannot be used from a WebWorker.
Hello, any progress?
They fixed it. I tested the latest version and it works well.
What happened?

Steps to Reproduce

1) Set up the collector as follows:
2) Run the collector:
3) Clone this repo and go into `examples/opentelemetry-web`.
4) Run `npm install` and `npm run`.
5) Visit http://localhost:8090/fetch/ and click on 'Test'.
6) Open up the developer console; under the Network tab, you will see `/traces` responding with 400s.

Expected Result

The POST `/traces` requests should succeed.

Actual Result

The POST `/traces` requests receive a 400 Bad Request. The response is:

The request looks something like the following:
Additional Details
I suspect this error has something to do with the negative numbers, but I don't understand enough about the proper behavior of unsigned_long.js. I am on an ARM Mac, and I am not sure if this could be the cause of the issue; I know that my colleague is able to use opentelemetry-js without much problem.
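For context on why negative numbers can appear in the payload at all: protobufjs-style Long values split a 64-bit integer into two signed 32-bit halves, so a half with its top bit set prints as negative. A small generic demo, not library-specific code:

```typescript
// A 32-bit half with its top bit set prints as a negative number,
// because "| 0" yields a signed 32-bit view of the value.
const half = 0x89abcdef;        // top bit set
const signed = half | 0;        // signed 32-bit view: negative
const unsigned = signed >>> 0;  // unsigned view: original value back
console.log(signed < 0);        // true
console.log(unsigned === half); // true
```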
OpenTelemetry Setup Code
package.json
Relevant log output
On the collector side, this is the output:
On the client side: