It turned out that the CSV streaming pipeline had no error handling, which was causing the intermittent zap-api failures:
internal/streams/legacy.js:59
throw er; // Unhandled stream error in pipe.
^
Error: Data should not be empty or the "fields" option should be included
at Parser.parser.onToken (/app/node_modules/json2csv/dist/json2csv.cjs.js:4604:24)
at Parser.proto.write (/app/node_modules/json2csv/dist/json2csv.cjs.js:4225:34)
at JSON2CSVTransform._transform (/app/node_modules/json2csv/dist/json2csv.cjs.js:4642:19)
at JSON2CSVTransform.Transform._read (_stream_transform.js:186:10)
at JSON2CSVTransform.Transform._write (_stream_transform.js:174:12)
at doWrite (_stream_writable.js:396:12)
at writeOrBuffer (_stream_writable.js:382:5)
at JSON2CSVTransform.Writable.write (_stream_writable.js:290:11)
at Stream.ondata (internal/streams/legacy.js:16:26)
at emitOne (events.js:116:13)
at Stream.emit (events.js:211:7)
at drain (/app/node_modules/through/index.js:36:16)
at Stream.stream.queue.stream.push (/app/node_modules/through/index.js:45:5)
at Stream.<anonymous> (/app/node_modules/JSONStream/index.js:203:12)
at _end (/app/node_modules/through/index.js:65:9)
at Stream.stream.end (/app/node_modules/through/index.js:74:5)
npm ERR! code ELIFECYCLE
Since we are about to introduce other download endpoints that do not use streaming, and we have determined that the app can comfortably process even a full-dataset download in working memory, the simplest fix here was to remove the streaming logic entirely and do a plain old data query, then dump the result to the response :)
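Roughly, the non-streaming approach looks like this. This is only a sketch: `fetchAllRows` and the field list are hypothetical stand-ins, and a tiny hand-rolled CSV helper stands in for json2csv's synchronous Parser (which, given an explicit `fields` option, also avoids the "Data should not be empty" error on empty result sets).

```javascript
// Tiny CSV helper standing in for json2csv's synchronous Parser.
function toCsv(rows, fields) {
  const escape = (v) => {
    const s = v == null ? '' : String(v);
    // Quote values containing commas, quotes, or newlines; double embedded quotes.
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = fields.join(',');
  const body = rows.map((r) => fields.map((f) => escape(r[f])).join(','));
  return [header, ...body].join('\n');
}

// Plain query, then dump the whole CSV into the response in one go.
async function downloadCsv(req, res) {
  const rows = await fetchAllRows(req.query); // hypothetical data query
  const csv = toCsv(rows, ['id', 'name', 'createdAt']); // hypothetical fields
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="export.csv"');
  res.send(csv);
}
```

Because the whole result set is materialized before anything is written, an error in the query or the conversion surfaces as an ordinary rejected promise that the route's error handler can catch, instead of an unhandled stream error crashing the process.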