adrianlopezroche / fdupes

FDUPES is a program for identifying or deleting duplicate files residing within specified directories.

Merge n groups of hardlinked files to a single hardlinked file #49

Open sandrotosi opened 8 years ago

sandrotosi commented 8 years ago

From @sandrotosi on December 20, 2015 14:5

From warpx...@gmail.com on October 23, 2011 21:55:19

What steps will reproduce the problem?

1. fdupes -H -L

What is the expected output? What do you see instead?

Expected: fdupes does what I asked.
Instead: fdupes: options --linkhard and --hardlinks are not compatible

What version of the product are you using? On what operating system?

fdupes 1.50-PR2 on Debian Squeeze amd64

Please provide any additional information below.

The only way to achieve the desired result is to run fdupes --linkhard multiple times (until it stops merging hardlinked groups of files).

This is impractical for large directories.

For example, if I run:

    mkdir -p test/a test/b
    echo "blah" >test/a/a
    echo "blah" >test/b/a
    ln test/a/a test/a/b
    ln test/b/a test/b/b

now a/a and a/b share one inode, and b/a and b/b share another.

    fdupes -r -L test

now a/a, a/b, and b/b share one inode, but b/a is left all by itself.

    fdupes -r -L test

now they all share one inode, which is the desired result.
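The layout described above can be reproduced and checked without fdupes at all, just by comparing inode numbers. A minimal Python sketch; `inode` is a throwaway helper, not anything from fdupes:

```python
import os
import tempfile

# Recreate the reporter's layout: two separate hardlink groups, both
# holding the same content ("blah").
root = tempfile.mkdtemp()
for d in ("a", "b"):
    os.mkdir(os.path.join(root, d))
    path = os.path.join(root, d, "a")
    with open(path, "w") as f:
        f.write("blah\n")
    os.link(path, os.path.join(root, d, "b"))

def inode(rel):
    # Throwaway helper: inode number of a path relative to `root`.
    return os.stat(os.path.join(root, rel)).st_ino

# a/a and a/b share one inode; b/a and b/b share another.
assert inode("a/a") == inode("a/b")
assert inode("b/a") == inode("b/b")
assert inode("a/a") != inode("b/a")
```

After a single complete merge, all four paths should report the same inode; the bug is that one `fdupes -L` pass only gets partway there.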

Original issue: http://code.google.com/p/fdupes/issues/detail?id=22

Copied from original issue: sandrotosi/fdupes-issues#8

sandrotosi commented 8 years ago

From fszcze...@gmail.com on October 29, 2011 19:00:41

--linkhard is a vendor addition. This bug should be reported to Debian.

I have a patch to do hardlinks that handles this case correctly. I will be posting it on issue #8 as soon as I complete testing.
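A one-pass merge has to treat each inode, not each path, as the unit of work, so that every member of every duplicate group gets re-linked onto a single keeper. A minimal Python sketch of that idea, assuming all paths already hold identical content; `merge_hardlink_groups` is a hypothetical helper, not the patch mentioned here:

```python
import os
from collections import defaultdict

def merge_hardlink_groups(paths):
    """Merge several hardlink groups of identical files in one pass.

    Hypothetical helper, not fdupes code: group the paths by inode,
    keep the first path of the first inode as the canonical file,
    then re-link every path of every other inode onto it.
    """
    by_inode = defaultdict(list)
    for p in paths:
        by_inode[os.stat(p).st_ino].append(p)
    inodes = list(by_inode)          # insertion order (Python 3.7+)
    keeper = by_inode[inodes[0]][0]
    for ino in inodes[1:]:
        for p in by_inode[ino]:
            os.unlink(p)             # a real tool would link to a temp
            os.link(keeper, p)       # name and rename, to stay atomic
```

Because the iteration is over inodes rather than paths, files that are already hardlinked to each other are collapsed together with every other group in a single pass.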

berlincount commented 8 years ago

The opposite should also be implemented.

When hardlinking very many files to each other, you eventually get "unable to create a hardlink for the file: Too many links" (the filesystem's per-inode link limit) .. a second "group" should be opened then, I'd say.
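The suggested behaviour could look like this: attempt the link, and when the filesystem reports EMLINK (too many links to one inode), promote the current file to the keeper of a new group instead of aborting. A hedged Python sketch; `link_all` and the temporary-file suffix are illustrative only, not fdupes code:

```python
import errno
import os

def link_all(keeper, others):
    """Hardlink each path in `others` onto `keeper`; when the link
    count limit is hit (EMLINK), start a new group at the current
    file instead of failing.  Sketch of the suggested behaviour,
    not fdupes code."""
    groups = [keeper]
    for p in others:
        tmp = p + ".fdupes-tmp"       # hypothetical temporary name
        try:
            os.link(groups[-1], tmp)  # link onto the current group head
        except OSError as e:
            if e.errno != errno.EMLINK:
                raise
            groups.append(p)          # limit reached: open a new group,
            continue                  # keeping p's existing copy as head
        os.replace(tmp, p)            # atomically swap in the hardlink
    return groups
```

On the normal path this behaves like a plain merge (one group head returned); only when the limit is reached does a second group head appear in the returned list.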