Closed JeyanthVictory closed 4 years ago
To merge all files that have the extension `.part` in the current directory:

    const fs = require('fs').promises;

    (async () => {
      const files = (await fs.readdir('.')).filter(a => a.endsWith('.part'));
      const parts = [];
      for(const file of files)
        parts.push(await fs.readFile(file));
      await fs.writeFile('All parts', Buffer.concat(parts));
    })().catch(console.log);
@Hakerh400 ,Thanks for your valuable suggestion. Unfortunately, it throws error. Here is what I've tried. https://codeshare.io/2KYxBM
That is because you didn't properly copy my script. Notice the line `const fs = require('fs').promises`: you either didn't include that, or you removed the `.promises` part.
The issue template asked for your Node.js version, and you simply removed that. So I assumed you're using Node.js v12.x, but according to the error message it is older than v10.x. Please post your Node.js version. For the script to work, you should either update your Node.js to v12.x, or use the synchronous counterparts of the `fs` methods (add `Sync` after the method names).
@Hakerh400 , I just included `require()` as you mentioned, and I'm using Node.js 8.9.3. Here you can see my code: https://codeshare.io/2KYxBM
Yes, I had already seen your code, but as I said, you didn't use it properly. My code contains the line `const fs = require('fs').promises`, which you included but renamed to `fs1`, and then you use it nowhere (you kept `fs`, while you should have renamed it to `fs1`).
Given that your Node.js version is 8.9.3, there is no `fs.promises` API, so you need to promisify the methods first. Here is the fixed code:
Hope it helps.
@Hakerh400 , Thank you for your great stuff..It works!!!
@Hakerh400 , I have one more doubt. If I want to delete all the `.part` files after they are merged, what am I supposed to do? Since it is an asynchronous function, I am stuck on where to delete that junk. I tried the callback of `fs.writeFile()`, but it never fires.
After the line `writeFile: util.promisify(fs.writeFile),` add

    unlink: util.promisify(fs.unlink),

and after the line `await fs.writeFile('All parts', Buffer.concat(parts));` add

    for(const file of files)
      await fs.unlink(file);
@Hakerh400 , Great!!
@Hakerh400 , Consider this case: if I have chunks totalling 10 GB, I cannot merge them, as the result exceeds the maximum Buffer size. What am I supposed to do to get around this?
Thanks in advance!
This allows unlimited file size:

    const fs = require('fs');
    const util = require('util');

    const fsp = {
      readdir: util.promisify(fs.readdir),
      unlink: util.promisify(fs.unlink),
    };

    (async () => {
      const files = (await fsp.readdir('.')).filter(a => a.endsWith('.part'));
      const ws = fs.createWriteStream('All parts');

      // Stream one part into the shared write stream without closing it
      const read = file => new Promise((resolve, reject) =>
        fs.createReadStream(file)
          .on('data', chunk => ws.write(chunk))
          .on('end', resolve)
          .on('error', reject)
      );

      for(const file of files){
        await read(file);
        await fsp.unlink(file);
      }

      ws.end();
    })().catch(console.log);
@Hakerh400 , It rocks..Thank you very much.
@JeyanthVictory - looks like this issue can be closed now. Let me know if that is not the case.
Yes. It can be closed. Will get back to you, if anything needed. Thanks for the solution!!!
I have been facing a ton of trouble converting the uploaded chunk files back into the original file (the file before uploading). I can receive the different chunks and save them with a `.part` extension, but I am stuck on how to merge the saved chunk files. Have a look at my code: https://codeshare.io/Gb6Ojb Hope I get some solution. Thanks in advance!