Create a Postgres database with a table fix.event_log, then run the script below in both Node.js and Bun. Node.js finishes as expected; Bun gets stuck on pipeline.
import pg from "pg";
import { to as copyTo } from "pg-copy-streams";
import { pipeline } from "node:stream/promises";
import fs from "fs";

const { Client } = pg;

// Tables to export.
const tables = [{ table: "fix.event_log" }];

// Ensure the output directory exists.
const dir = "./data";
if (!fs.existsSync(dir)) {
  fs.mkdirSync(dir, { recursive: true });
}

const client = new Client({
  connectionString: "postgresql://postgres:password@localhost:8000/postgres",
});
await client.connect();

for (const t of tables) {
  const csvFilePath = `${dir}/${t.table}.csv`;
  const writableStream = fs.createWriteStream(csvFilePath);
  // COPY the table to stdout as CSV and stream it into the file.
  const stream = client.query(copyTo(`COPY (SELECT * FROM ${t.table}) TO STDOUT WITH CSV HEADER`));
  console.log("start pipeline");
  await pipeline(stream, writableStream);
  console.log("end pipeline");
}

console.log("Data exported");
await client.end();
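Since pipeline() is the call that hangs under Bun, one possible workaround is to bypass stream.pipeline entirely and consume the readable manually with explicit backpressure handling. The sketch below is an assumption, not a verified fix: it substitutes a stand-in Readable (hypothetical CSV rows) for the live COPY stream so it can run without a database.

```javascript
import fs from "node:fs";
import { once } from "node:events";
import { Readable } from "node:stream";

async function exportCsv(path) {
  // Stand-in for the pg-copy-streams COPY stream (hypothetical rows).
  const copyStream = Readable.from(["id,name\n", "1,alpha\n", "2,beta\n"]);
  const out = fs.createWriteStream(path);
  for await (const chunk of copyStream) {
    // Respect backpressure: wait for "drain" whenever write() returns false.
    if (!out.write(chunk)) await once(out, "drain");
  }
  out.end();
  // Wait until all buffered data has been flushed to disk.
  await once(out, "finish");
}

await exportCsv("./event_log.csv");
```

In the real script, the stand-in Readable would be replaced by the stream returned from client.query(copyTo(...)).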
What version of Bun is running?
1.1.3+2615dc742
What platform is your computer?
Darwin 23.4.0 arm64 arm
What steps can reproduce the bug?
Run the script above (along with its package.json) in both Node.js and Bun.
What is the expected behavior?
The same behavior in both Node.js and Bun: the export completes and the script exits.
What do you see instead?
In Bun, the script never finishes; it hangs on the first await pipeline(...) call and "end pipeline" is never logged.
Additional information
I'm not sure whether there's a better way to use Bun's file I/O to write the file. I've tried a few different approaches, but haven't gotten any of them to work.
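For reference, one Bun-specific approach would be to write with Bun's incremental file writer (the FileSink returned by Bun.file(path).writer()) instead of fs.createWriteStream. This is a hedged sketch based on Bun's documented FileSink API, not a confirmed fix for the hang; it again uses a stand-in Readable with hypothetical rows in place of the COPY stream, and falls back to a plain fs write under Node.js so the same file runs in both runtimes.

```javascript
import fs from "node:fs";
import { Readable } from "node:stream";

// Stand-in for the pg-copy-streams COPY stream (hypothetical rows).
const copyStream = Readable.from(["id,name\n", "1,alpha\n", "2,beta\n"]);
const outPath = "./event_log_bun.csv";

if (typeof Bun !== "undefined") {
  // Bun's incremental file writer (FileSink).
  const sink = Bun.file(outPath).writer();
  for await (const chunk of copyStream) sink.write(chunk);
  await sink.end(); // flush remaining bytes and close the file
} else {
  // Node.js fallback so the sketch runs in both runtimes for comparison.
  let csv = "";
  for await (const chunk of copyStream) csv += chunk;
  fs.writeFileSync(outPath, csv);
}
```

Whether FileSink avoids whatever blocks pipeline() here is untested; it mainly sidesteps the Node stream machinery on the write side.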