[Closed] GordianDziwis closed this issue 5 years ago
I think you could run this download loop for the files in parallel, with something like this:

async function ParallelMapFlow(jobs) {
  // Start every job at once; map returns the pending promises immediately.
  let results = jobs.map((job) => doJob(job));
  let finalResult = 0;
  // Then await them in order to aggregate the results.
  for (const result of results) {
    finalResult += await result;
  }
  console.log(finalResult);
}
ParallelMapFlow([1, 2, 3]);
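For what it's worth, `Promise.all` expresses the same fan-out-then-await pattern more directly. A minimal sketch, with `doJob` stubbed purely for illustration (it is not the project's real download function):

```javascript
// Stub standing in for the real per-file download job.
async function doJob(job) {
  return job * 2;
}

async function parallelMapFlow(jobs) {
  // All doJob calls start immediately; Promise.all resolves once every one finishes.
  const results = await Promise.all(jobs.map((job) => doJob(job)));
  return results.reduce((sum, r) => sum + r, 0);
}

parallelMapFlow([1, 2, 3]).then((total) => console.log(total)); // logs 12
```

Error handling differs slightly: `Promise.all` rejects as soon as any job fails, whereas the loop above surfaces failures in job order.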
OK, we need a test for that. If you have one, a pull request would be great ^^
On my machine (Ubuntu MATE 18.04 LTS, i3 8th gen @ 4 GHz) the CPU usage is 40-100% and memory usage is around 1 GB. For perspective, the complete Ubuntu system boots in about 600 MB ;) Of course this is 0.2.2; I believe there is room for optimisation...
For comparison, Grive2 (a C++ binary) uses about 60 MB and 1% CPU on average on the same computer.
Fixed in 0.3.0
Download speed is about one file per second, and it is not limited by bandwidth: throughput is just a few kB/s, except for the occasional bigger file. I am syncing about 75,000 files (16 GB total) over a 400 Mbit/s connection. That should finish in a few minutes, but at 1 s/file it takes about 20 h.
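Per-file latency dominates here: 75,000 sequential requests at ~1 s each is roughly 20 hours, while running N downloads concurrently divides the wall-clock time by about N until bandwidth becomes the bottleneck. A minimal sketch of a bounded-concurrency pool (the `downloadFile` stub and the limit of 8 are illustrative assumptions, not the project's actual API):

```javascript
// Run worker over items with at most `limit` calls in flight at once.
async function mapWithConcurrency(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  // Each lane repeatedly claims the next unprocessed index until none remain.
  async function lane() {
    while (next < items.length) {
      const i = next++; // safe: no await between read and increment
      results[i] = await worker(items[i]);
    }
  }
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, lane));
  return results;
}

// Placeholder download simulating per-file latency.
async function downloadFile(name) {
  await new Promise((resolve) => setTimeout(resolve, 10));
  return `saved:${name}`;
}

mapWithConcurrency(["a", "b", "c"], 8, downloadFile).then(console.log);
```

A fixed limit keeps memory bounded and avoids hammering the remote API, which a plain `Promise.all` over 75,000 files would do.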