porsager / postgres

Postgres.js - The Fastest full featured PostgreSQL client for Node.js, Deno, Bun and CloudFlare

Copy stream hangs if using gzip stream compression #499

Open uasan opened 1 year ago

uasan commented 1 year ago

Hello, we caught an interesting bug :)

COPY streams work fine on their own, but if the stream is piped into a compression stream, the entire connection pool freezes.

Reproduction example:

import postgres from 'postgres';
import { createGzip } from 'zlib';

// Small pool so the hang is easy to hit
const db = postgres({
  max: 2,
});

for (let i = 0; i < 10; i++) {
  // COPY ... TO STDOUT as a readable stream
  let stream = await db
    .unsafe(`COPY pg_catalog.pg_attribute TO STDOUT`)
    .readable();

  // Piping into gzip is what triggers the hang;
  // without the gzip stage the loop completes fine
  stream.pipe(createGzip()).on('data', console.log);

  // Once the pool is stuck, this query never resolves
  await db.unsafe(`SELECT $1 AS "CONTINUE"`, [i]);
}

console.log(await db.unsafe(`SELECT 'DONE' AS "END"`));

I've experimented with the pipe setup, but in any case something is wrong with the connection pool after the stream is sent into the pipe.

uasan commented 1 year ago

It seems this problem was the cause of the connection errors I described in https://github.com/porsager/postgres/issues/457. Because of the pipe, connections in the pool stay busy in read mode even though Postgres has already returned the full response and there is nothing left to read.
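
A quick way to confirm that symptom (a sketch mirroring the repro above; it assumes the same `db` instance, and `once` is standard Node `events`):

import { once } from 'events';
import { createGzip } from 'zlib';

const stream = await db
  .unsafe(`COPY pg_catalog.pg_attribute TO STDOUT`)
  .readable();

stream.pipe(createGzip()).resume();

// If the connection really stays busy in read mode, this await never
// resolves, because 'end' is never emitted on the COPY stream.
await once(stream, 'end');
console.log('connection released');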

porsager commented 1 year ago

Sorry I haven't been on this @uasan .. I can repro here, so I hope I'll find some time to look into it soon!

uasan commented 1 year ago

I think this is because the gzip stream sometimes pauses the source stream via backpressure.
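
That pause is ordinary Node stream backpressure, which a driver has to handle by resuming its socket reads later. A minimal stand-alone sketch (no postgres involved) showing pipe() pausing the source once gzip's buffers fill:

import { Readable } from 'stream';
import { createGzip } from 'zlib';
import { randomBytes } from 'crypto';

// Incompressible data fills gzip's buffers quickly.
const source = Readable.from(
  (function* () {
    for (let i = 0; i < 64; i++) yield randomBytes(64 * 1024);
  })()
);

source.on('pause', () => console.log('source paused by backpressure'));

// Pipe into gzip but never read the compressed output: once gzip's
// internal buffers hit their highWaterMark, write() returns false,
// pipe() pauses the source, and it stays paused until the output
// side is consumed.
source.pipe(createGzip());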

Ivor808 commented 1 month ago

I wonder if I am hitting a similar issue. Certain COPY commands are read to stdout into a readable stream, and those readable streams sometimes just hang; it seems the "end" event is never emitted.
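
Until the driver bug is fixed, one way to at least surface the hang instead of waiting forever is to run the pipe through stream/promises' pipeline with an abort timeout (a sketch: the table name, output file, and 30 s limit are placeholders, it assumes a `db` instance as in the repro, and AbortSignal.timeout needs Node 17.3+):

import { pipeline } from 'stream/promises';
import { createGzip } from 'zlib';
import { createWriteStream } from 'fs';

// Assumes `db` is a postgres() instance as in the repro above.
const stream = await db
  .unsafe(`COPY some_table TO STDOUT`) // placeholder table
  .readable();

try {
  await pipeline(stream, createGzip(), createWriteStream('dump.gz'), {
    signal: AbortSignal.timeout(30_000),
  });
  console.log('copy finished, "end" was emitted');
} catch (err) {
  // An AbortError here means the stream stalled and never emitted 'end'.
  console.error('COPY stream stalled:', err);
}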