Azure / azure-sdk-for-js

This repository is for active development of the Azure SDK for JavaScript (NodeJS & Browser). For consumers of the SDK we recommend visiting our public developer docs at https://docs.microsoft.com/javascript/azure/ or our versioned developer docs at https://azure.github.io/azure-sdk-for-js.
MIT License

"Cannot read property 'ERROR' of undefined" raises on TableClient.submitTransaction #16112

Closed verikono closed 3 years ago

verikono commented 3 years ago

Describe the bug It appears the coreTracing structure has changed over time: coreTracing.SpanStatusCode seems to have disappeared, and the error in the title is thrown when the code attempts to access the ERROR property of SpanStatusCode, which is of course undefined.

In the dist/index.js file, lines 3927-3929:

            span.setStatus({
                code: coreTracing.SpanStatusCode.ERROR,
                message: error.message
            });
            throw error;
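The failure mode can be sketched as follows. This is a minimal, self-contained reproduction (the shim objects below are illustrative stand-ins, not the SDK's actual modules): newer @azure/core-tracing exports SpanStatusCode, while a stale build does not, so the property lookup `coreTracing.SpanStatusCode.ERROR` throws a TypeError.

```javascript
// Mimics the SDK's error-status code path shown above.
function setErrorStatus(coreTracing, error) {
  return {
    code: coreTracing.SpanStatusCode.ERROR, // throws if SpanStatusCode is undefined
    message: error.message,
  };
}

// Newer tracing API shape: the lookup succeeds.
const newCoreTracing = { SpanStatusCode: { UNSET: 0, OK: 1, ERROR: 2 } };
console.log(setErrorStatus(newCoreTracing, new Error("boom")).code); // 2

// Stale tracing API without SpanStatusCode: reproduces the reported error.
const oldCoreTracing = {};
try {
  setErrorStatus(oldCoreTracing, new Error("boom"));
} catch (e) {
  console.log(e instanceof TypeError); // true ("Cannot read property 'ERROR' of undefined")
}
```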
maorleger commented 3 years ago

Hi @verikono, thanks so much for opening this issue - sorry to hear about the breaking changes, that must be frustrating!

I'd like to get a little more information from you. I see that you are using @azure/data-table 12.0.0 - are you also pulling in any @opentelemetry packages directly or any other Azure packages? Could you share your package.json if possible?

I'm looking at the code for coreTracing and I see that SpanStatusCode.ERROR is defined in interfaces.ts. I also see that the bundled @azure/core-tracing package does transpile SpanStatusCode correctly and exports it correctly.

I know that @opentelemetry has changed their public API considerably while they were in preview, and they only recently GA'd. It's possible that, with a much older version of @azure/core-tracing being pulled in, you'd be seeing this.
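One way to check whether an older or duplicate copy of @azure/core-tracing is being resolved in a project (a diagnostic sketch; the exact tree output depends on your lockfile and npm version):

```shell
# List every resolved copy of @azure/core-tracing in the dependency tree.
# Multiple entries at different versions usually indicate a conflict like this one.
npm ls @azure/core-tracing || true

# npm 7+ can also explain which packages pull a given dependency in:
npm explain @azure/core-tracing || true
```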

Are you able to share a minimal repro of this issue? That would help me tremendously in figuring out what could be happening since I wasn't able to reproduce this locally.

Thanks in advance!

verikono commented 3 years ago

Hi @maorleger ,

Find the EXTREMELY experimental repo here: https://github.com/verikono/simple-datalake-client . I wouldn't normally share this because a) it's a learning project for me using Azure with Node streams and b) it has no idea what it wants to be yet. It seems it is fast becoming a library of stream modules (Readables, Writables, Transforms, etc.) with a few QoL methods.

From what you've told me regarding interfaces.ts, I think you might be on the money about this issue originating elsewhere. I'll also follow that line of thinking and report back if I find something - I don't want to waste your time if it's an issue from a library you have no responsibility for!

verikono commented 3 years ago

Allow me some time to sort some of this code out so I can give you a focused view on the actual problem. Will get back.

verikono commented 3 years ago

Right on the money @maorleger , right down to the package! I was importing a data tables client which had a fragment of stale code that imported applicationInsights.

Thank you for your time! It is greatly appreciated.

The error "19:The batch request contains multiple changes with same row key. An entity can appear only once in a batch request" is now being thrown with "InvalidDuplicateRow" - exactly the behavior it should show. Feel free to close this issue.
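The duplicate-row rule can be caught before the round trip to the service. Below is a hypothetical pre-flight check (the helper name and shape are mine, not part of @azure/data-tables): submitTransaction takes [operation, entity] tuples, and the service rejects any batch where the same partitionKey/rowKey pair appears more than once.

```javascript
// Returns the rowKeys that appear more than once for the same partitionKey.
function findDuplicateRowKeys(actions) {
  const seen = new Set();
  const duplicates = [];
  for (const [, entity] of actions) {
    const key = `${entity.partitionKey}|${entity.rowKey}`;
    if (seen.has(key)) duplicates.push(entity.rowKey);
    seen.add(key);
  }
  return duplicates;
}

// Example batch with a duplicated (partitionKey, rowKey) pair:
const actions = [
  ["create", { partitionKey: "p1", rowKey: "r1", value: 1 }],
  ["create", { partitionKey: "p1", rowKey: "r1", value: 2 }], // duplicate
  ["create", { partitionKey: "p1", rowKey: "r2", value: 3 }],
];
console.log(findDuplicateRowKeys(actions)); // [ 'r1' ]
```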

maorleger commented 3 years ago

@verikono my pleasure! I'll close this issue for now, but if you run into anything related to this feel free to reopen. And of course, we're here for you if you run into any other issues related to one of our packages so feel free to reach out!

By the way, I did take a quick look at the package you linked and completely unrelated: I think it's a really neat idea! Would love to see how it evolves over time (starred it :star:)

verikono commented 3 years ago

Thanks!

The place I work at is getting extremely excited about parquet (to the point of standardising on it), with the idea that it is a "database in a file". We have Databricks and Spark for major implementations, but something small that handles streaming parquet and CSV into a few transformers, and whatever else, could be handy somewhere along the line.