shachart closed this issue 8 years ago
Postgres has a mechanism for this: large objects.
Here's a module I made that lets you use them: https://www.npmjs.org/package/pg-large-object
While large objects are probably the more performant way to do this, I've hacked together a MongoDB GridFS-like solution using Knex.js, where the file is chunked into multiple rows.
https://gist.github.com/epeli/2907e43e162970d08568
While it may not be the fastest method, it uses very little memory and should work on all databases supported by Knex.js, although I've only tested it with Postgres.
My apologies if this was discussed before; I could not find a good answer for this.
I want to save (large-ish) files in my database (let's not examine whether this is a good idea - it's a toy project, and this approach suits it better than something more conventional). How would I go about streaming those files into the database, using node-pg, without buffering the entire file in RAM in Node?
Cheers, Shachar