zazuko / cube-creator

A tool to create RDF cubes from CSV files
GNU Affero General Public License v3.0

Transformation - FetchError ... reason: read ECONNRESET #1374

Open aresssera opened 1 year ago

aresssera commented 1 year ago

I created a cube with a portion (29,177 rows) of the complete dataset, which worked. The complete dataset is large (543,538 rows, 16 columns). But each time I try to run the transformation on the complete set, it runs for at least 2 hours and then fails with the following error (cc @AFoletti):

```
FetchError: request to https://stardog-int.cluster.ldbar.ch/cube-creator?graph=https%3A%2F%2Fint.cube-creator.lindas.admin.ch%2Fcube-project%2Fbfeogd100kennzahlensharedmobility-ddablvrpknd%2Fcube-data failed, reason: read ECONNRESET
    at ClientRequest.<anonymous> (/app/node_modules/@zazuko/node-fetch/lib/index.js:1483:11)
    at /app/node_modules/@opentelemetry/context-async-hooks/build/src/AbstractAsyncHooksContextManager.js:49:55
    at AsyncLocalStorage.run (async_hooks.js:314:14)
    at AsyncLocalStorageContextManager.with (/app/node_modules/@opentelemetry/context-async-hooks/build/src/AsyncLocalStorageContextManager.js:33:40)
    at ClientRequest.contextWrapper (/app/node_modules/@opentelemetry/context-async-hooks/build/src/AbstractAsyncHooksContextManager.js:49:32)
    at ClientRequest.emit (events.js:412:35)
    at ClientRequest.emit (domain.js:475:12)
    at TLSSocket.socketErrorListener (_http_client.js:475:9)
    at TLSSocket.emit (events.js:400:28)
    at TLSSocket.emit (domain.js:475:12)
```

It also stops at a different line each time, but the error remains the same. (screenshots attached)

Might be related to #628.
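Since `ECONNRESET` means the remote end (or an intermediary) dropped the TCP connection mid-request, one common mitigation for long-running uploads like this is to retry the request on transient network errors with exponential backoff. A minimal sketch of that idea follows; the names `fetchWithRetry` and `flaky` are purely illustrative and not part of cube-creator or node-fetch:

```javascript
// Hypothetical retry wrapper, not part of cube-creator.
// Retries an async operation when it fails with a transient
// network error code such as ECONNRESET, backing off between attempts.
async function fetchWithRetry(doFetch, { retries = 3, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await doFetch();
    } catch (err) {
      const transient = ['ECONNRESET', 'ETIMEDOUT', 'EPIPE'].includes(err.code);
      if (!transient || attempt >= retries) throw err;
      // Exponential backoff: 1x, 2x, 4x, ... the base delay.
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

module.exports = { fetchWithRetry };
```

Whether a retry is safe here depends on whether the `?graph=` write endpoint is idempotent (re-sending the same graph should be, but that is an assumption about the Stardog setup).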

AFoletti commented 1 year ago

Side question for the technical people at Zazuko (@tpluscode?): is it normal that transforming such a dataset (a little more than 500k rows with 16 attributes) can take more than 2 hours?

dabo505 commented 1 year ago

@AFoletti Yes, it's normal that it takes that long. @tpluscode is looking into it and will get back to you.

AFoletti commented 1 year ago

Thanks @dabo505 for your answer. The same process that earlier took more than 2 hours before failing with the error above later (yesterday) completed fine in under an hour. I'm of course happy it did, but this looks quite unreliable. I hope you can find something.

tpluscode commented 1 year ago

We will keep looking, but the fact that it succeeded on a later run suggests a transient issue.

I'm working on restoring our pipeline monitoring, which stopped working after third-party updates. That way we should be able to understand the issue better if it happens again.