Result set streaming

Currently it is possible to stream Readable objects into INSERT queries. This enables fast mass data inserts, but it is often also required to serve large result sets for download or Excel export requests. Currently the biggest restriction for these endpoints is the default limit of 1000 results, while the default batch size for an Excel export from UI5 is 5000 rows, which means that CAP currently receives 5x the requests for a single Excel export. The most important reason for the default limit of 1000 is to keep the application from running out of memory. Therefore, result set streaming enforces a highWaterMark following Node.js's standard streaming implementation. While this does not fully prevent a result set stream from using more memory than the highWaterMark, it is a soft limit enforced on the stream and provides a balance between throughput and memory usage.
Raw stream
Until now, CAP has always loaded all data into JavaScript objects and processed it inside the JavaScript layer. With raw result set streaming, the JSON result provided by the database connection is never parsed and is kept in a raw Buffer format. This greatly improves memory and CPU usage.
The big drawback is that the result cannot be manipulated inside the JavaScript layer; therefore it is not possible to call after handlers on the result set. Additionally, the protocol adapter must be able to handle the Readable and write it correctly onto the res body stream.
Object stream
For cases where the results need to be modified in JavaScript, the Object stream can be used. Instead of loading the whole result as an Array into memory, the same results are passed through as single Objects, allowing the protocol adapters to serialize them back to JSON and write them into the res body as they are processed.
While this does not come with the CPU usage benefits of Raw streams, it retains the memory usage benefits, which can still reduce response times, as V8 has garbage collection optimizations for short-lived objects.
Expand streams
When using Object streams, the root result set might contain only a few rows, while each row has a large number of children, which would still all be loaded into memory and counted as a single Object against the highWaterMark. To prevent this, a recursive streaming approach can be applied that handles all expand columns as Object streams as well.
Depending on the database connection protocol, it might still be required to load all results into the Readable buffer, as the order of the root and child rows is related: the children are interlaced between the root entries in the result set, so reading all root entries would require loading all their children into the buffer as well.
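The interlacing can be illustrated with a small sketch: a flat result set where child rows follow their root row is regrouped into roots that each expose a lazy async iterable of children. The row shapes and the `kind` discriminator are assumptions for illustration only:

```javascript
// Sketch of the Expand stream idea: children arrive interlaced with their
// parents in the result set and are exposed as per-row async iterables.
// The flat row format below is a made-up stand-in for a driver's wire format.
const flat = [
  { kind: 'root', ID: 1 },
  { kind: 'child', parent: 1, amount: 10 },
  { kind: 'child', parent: 1, amount: 5 },
  { kind: 'root', ID: 2 },
  { kind: 'child', parent: 2, amount: 7 }
]

// Regroup the interlaced rows: each yielded root carries its children
// as an async iterable rather than a fully-loaded array
async function* expandStream(rows) {
  let i = 0
  while (i < rows.length) {
    const root = rows[i++]
    const start = i
    while (i < rows.length && rows[i].kind === 'child') i++
    const children = rows.slice(start, i)
    yield { ID: root.ID, children: (async function* () { yield* children })() }
  }
}

async function main() {
  let total = 0
  for await (const row of expandStream(flat)) {
    for await (const child of row.children) total += child.amount
  }
  return total
}

main().then(console.log) // 22
```

Note that consuming root 2 here still requires having read past root 1's children, which is the buffering constraint described above.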
Usage examples
The following code examples are illustrative only, as these APIs are not yet implemented in @sap/cds.
Raw stream
The most common use case for Raw result set streams is in protocol adapters, where the final result has to be written to the response body without modification.
```javascript
const { pipeline } = require('stream/promises')

app.on('*', async (req, res) => {
  const result = await SELECT.from(entity)
  await pipeline(result, res) // propagates errors to both the database and the HTTP res stream
})
```
Object stream
The most common place for Object stream usage would be in custom handlers that need to modify the data or do additional calculations.
```javascript
let total = 0
for await (const row of SELECT.from(entity)) {
  total += row.amount
}
```
Expand stream
This is an extension of the previous use case, but with child rows.
```javascript
let total = 0
for await (const row of SELECT.from(entity)) {
  // for await works on async iterators as well as normal iterators, including Arrays
  for await (const child of row.children) {
    total += child.amount
  }
}
```
PR Status
JSON result stream