Closed: galaxy001 closed this issue 1 year ago
Hello @galaxy001,

Since your report, all operations have had their hardware requirements lowered. With the implementation of batching, 4 GB of memory should be enough to process any kind of dataset.

While there are more improvements to implement, could you check the latest code and reopen this issue if the problem persists?
Best regards,
I need to run a whole-volume dedup on a NAS with 4 GB of memory. However, the database file is now 18 GB, which makes the second step hit the swap file and become too slow to complete.

Would you offer an option to further separate the find-dupes step, so that it can run based only on the database file and output a list of candidate file pairs, maybe in the fdupes format? Then I could copy the list to my NAS and do step 3 there.
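For illustration only, here is a minimal sketch of what such a standalone find-dupes pass could look like. It assumes the database is a SQLite file with a hypothetical `hashes(path, digest)` table mapping file paths to content digests; the tool's actual schema may differ. It groups paths by digest and prints them in the fdupes format, i.e. one path per line with a blank line between duplicate groups:

```python
#!/usr/bin/env python3
# Hypothetical sketch: emit fdupes-style duplicate groups from a hash database.
# Assumes a SQLite file with a table `hashes(path TEXT, digest TEXT)`;
# the real database schema of the tool may differ.
import sqlite3
import sys

def main(db_path: str) -> None:
    con = sqlite3.connect(db_path)
    # Order by digest so duplicate candidates arrive grouped together,
    # keeping memory use independent of the database size.
    rows = con.execute("SELECT digest, path FROM hashes ORDER BY digest")
    group: list[str] = []
    last_digest = None
    for digest, path in rows:
        if digest != last_digest:
            if len(group) > 1:  # only groups with 2+ files are duplicates
                print("\n".join(group), end="\n\n")
            group = []
            last_digest = digest
        group.append(path)
    if len(group) > 1:  # flush the final group
        print("\n".join(group), end="\n\n")
    con.close()

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "hashes.db")
```

The resulting text file would be far smaller than the database itself, so it could be copied to the NAS and consumed by the final dedupe step without loading the full 18 GB database into 4 GB of memory.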