Open bonesoul opened 7 years ago
So this may only be my case, but the result is a .tar
file that's almost 3x the size of the original dump folder (small database)...
Is there no option to add gzip?
I tried tar -czf xxx.tar.gz xxx
on the uncompressed dump folder, which creates a compressed file only 1/7 the size (dump folder 7 MB, tar.gz 1 MB).
I guess the reason the tar is bigger than the dump folder is that the database isn't that big yet, so tar's per-file overhead takes a big chunk. I don't have a larger database to test with yet.
I'm finding the tar file 3 times as big in all cases (0 records and 200k records).
I have the same question. I also found that the tar files are large; adding an option to gzip would be great.
Here is a workaround for you.
How to make a .tar.gz backup:
const backup = require('mongodb-backup');
const {exec} = require('child_process');

const tarFileName = 'backup.tar';

backup({
  uri: 'mongodb://localhost:27017/test',
  root: __dirname,
  tar: tarFileName,
  callback: () => {
    exec(`cd ${__dirname} && gzip -9 ${tarFileName}`, () => {
      console.log('Backup file has been created');
    });
  }
});
How to restore a .tar.gz backup:
const restore = require('mongodb-restore');
const {exec} = require('child_process');

const tarFileName = 'backup.tar';

exec(`cd ${__dirname} && gzip -d ${tarFileName}.gz`, () => {
  restore({
    uri: 'mongodb://localhost:27017/test',
    root: __dirname,
    tar: tarFileName,
    callback: () => {
      console.log('Restored');
    }
  });
});
Hi, have you seen https://github.com/hex7c0/mongodb-backup/blob/1.6/examples/tar.js ?