birkenfeld / fddf

Fast data dupe finder
Apache License 2.0

Too many open files error #2

Closed: rekka closed this issue 7 years ago

rekka commented 7 years ago

Hi, thanks for the nice tool! This is definitely something I can use.

I installed version 1.1.0 via `cargo install`. However, when I tried to run it on my home dir, it blew up the stack:

```
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Error { repr: Os { code: 24, message: "Too many open files" } }', /checkout/src/libcore/result.rs:859
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Error { repr: Os { code: 24, message: "Too many open files" } }', /checkout/src/libcore/result.rs:859
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Error { repr: Os { code: 24, message: "Too many open files" } }', /checkout/src/libcore/result.rs:859
...
100s more
...
thread '<unknown>' has overflowed its stack
fatal runtime error: stack overflow
[1]    3030 abort (core dumped)  fddf .
```
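For context: os error 24 is `EMFILE` ("Too many open files"), which the kernel returns once a process exhausts its file-descriptor limit, and each `unwrap()` on that error panics a worker thread. The sketch below (not fddf's actual code; `open_checked` is a hypothetical helper) shows one way to surface the error to the caller instead of panicking, so a directory walker could back off, close some handles, and retry:

```rust
use std::fs::File;
use std::io;

// Hypothetical helper: propagate EMFILE instead of unwrap()-ing,
// so the caller can back off rather than panic the thread.
fn open_checked(path: &str) -> io::Result<File> {
    match File::open(path) {
        Err(e) if e.raw_os_error() == Some(24) => {
            // os error 24 is EMFILE ("Too many open files") on Linux
            eprintln!("fd limit reached while opening {}: {}", path, e);
            Err(e)
        }
        other => other,
    }
}

fn main() {
    // Any failure (here: a nonexistent path) comes back as Err,
    // not as a panic from unwrap().
    assert!(open_checked("/no/such/path").is_err());
    println!("ok");
}
```

As a stopgap on the user side, raising the per-process descriptor limit (e.g. `ulimit -n` in the shell) can also postpone the error, though it doesn't fix unbounded concurrent opens.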
birkenfeld commented 7 years ago

Thanks for the report. I can reproduce this, and will check how to best fix it.

birkenfeld commented 7 years ago

OK, this should now be fixed in 4307685b4dedc32f81edb17783480c07b7e89599.

(Also added a bunch of features.)

Let me know if it works for you now. My ~200 GB home dir, with lots of dupes, now processes fine in about 5 minutes.

rekka commented 7 years ago

Neat, thanks a lot! Works like a charm.