Expected behavior
The export server should be able to handle large amounts of data. The limiting factor appears to be the Chrome DevTools protocol implementation, which is capped at 100 MB. Assuming double precision at 8 bytes per point, that allows roughly 12,979,200 data points to be transferred, although some of those bytes need to be reserved for the other settings in the options object.
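For reference, a rough back-of-the-envelope calculation of that budget (my own estimate, assuming the cap is 100 * 1024 * 1024 bytes and the payload is nothing but binary doubles; the figure above is a bit lower because it leaves headroom for the rest of the options object):

```js
// Back-of-the-envelope point budget, assuming a 100 * 1024 * 1024 byte cap
// and one IEEE 754 double (8 bytes) per point. The usable budget is smaller,
// since the rest of the options object shares the same cap.
const capBytes = 100 * 1024 * 1024;       // 104,857,600 bytes
const bytesPerPoint = 8;
console.log(Math.floor(capBytes / bytesPerPoint)); // 13,107,200 points before any overhead
```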
Additionally, the exporter crashes without a proper error message when the 100 MB limit is exceeded.
Actual behavior
Data is transferred as a string representation of the JSON object, which consumes more than twice the number of bytes per data point. We are currently hitting a limit at around 2,550,000 data points, which produces a JSON string of roughly 100 MB. I would have expected it to be around 50 MB, but the data appears to be duplicated within the transferred message (it shows up in both data.params.arguments[1].value.export.options and data.params.arguments[1].value.export.strInj).
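A quick sketch of why the textual JSON costs so much more per point (illustrative only, not measured against the exporter itself):

```js
// Compare raw binary size with JSON-string size for the same numbers.
// Values with long decimal expansions easily take 18+ characters as text
// versus 8 bytes as a double; duplicating the series in the message
// (options and strInj) doubles the cost again.
const points = Array.from({ length: 1_000_000 }, (_, i) => i + Math.random());

const binaryBytes = points.length * 8;
const jsonBytes = Buffer.byteLength(JSON.stringify(points), 'utf8');

console.log({ binaryBytes, jsonBytes, ratio: (jsonBytes / binaryBytes).toFixed(2) });
```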
The exporter should not crash when the limit is exceeded. Instead, it should log the specific reason.
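As a sketch of the kind of guard I mean (all names here are hypothetical, not taken from the exporter's code base):

```js
// Hypothetical pre-flight check: estimate the serialized payload size and
// raise a descriptive error instead of letting the protocol layer kill the
// service when the message is too large.
const MAX_CDP_MESSAGE_BYTES = 100 * 1024 * 1024; // assumed protocol cap

function assertPayloadFits(payload) {
  const size = Buffer.byteLength(JSON.stringify(payload), 'utf8');
  if (size > MAX_CDP_MESSAGE_BYTES) {
    throw new Error(
      `Export payload is ${size} bytes and exceeds the ` +
      `${MAX_CDP_MESSAGE_BYTES}-byte DevTools protocol limit; ` +
      'reduce the number of data points or split the export.'
    );
  }
}
```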
One potential solution could be to serialize the JSON to BSON, transfer it as binary data, and then deserialize it back to JSON on the client side.
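A minimal sketch of that round trip using the bson npm package (just to illustrate the idea; whether BSON is the right binary format for the export server is open for discussion):

```js
// Serialize the chart options to BSON, transfer the Buffer as binary data,
// and deserialize back to a plain object on the receiving side.
const { serialize, deserialize } = require('bson');

const options = { series: [{ data: [1.5, 2.25, 3.125] }] }; // stand-in for millions of points

const binary = serialize(options);      // Node.js Buffer, sent as binary
const restored = deserialize(binary);   // plain object again on the client

console.log(binary.length, restored.series[0].data.length);
```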
Reproduction steps
Render a chart with 3 million data points and observe the service crashing.
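A minimal sketch of such a data set and its JSON footprint (the surrounding request to the export server is whatever your setup normally uses):

```js
// Generate 3 million (x, y) points and check how large the JSON gets.
// Submitting chart options of this size is what triggers the crash above;
// per the observation above, the data is also duplicated in the CDP message.
const data = Array.from({ length: 3_000_000 }, (_, i) => [i, Math.sin(i / 1000)]);
const options = { series: [{ data }] };

const payloadBytes = Buffer.byteLength(JSON.stringify(options), 'utf8');
console.log(`${(payloadBytes / 1024 / 1024).toFixed(1)} MB of JSON for the series alone`);
```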