Closed by vitaly-t 7 years ago
This is directly related to: #12.
I have mostly figured it out, by doing `stream.on('data', () => { /* do internal increment */ })`.
But I have run into another issue with this: #32.
Thanks to my own hack, I was previously able to process pages of rows, but now I'm facing a very inefficient row-by-row data feed from the stream.
Yeah, if you are reading from a stream and want to know how many records are coming through, it's trivial to keep a counter of rows as you read them from the stream. That's the recommended approach AFAIK.
When I added support for v1.0 of this library within pg-promise, I had to add a hack (code injection) around the read operation (method `_fetch`) in order to pull out such basic information as the number of rows that went through the stream. With version 1.1, do I have to write a new hack for it, or is there now a civilized approach to getting the number of rows once the streaming has finished?
This is the hack that I had to do for v1.0: