AgentD / squashfs-tools-ng

A new set of tools and libraries for working with SquashFS images

Resolving hardlink: No such file or directory #64

Closed iScriptLex closed 3 years ago

iScriptLex commented 3 years ago

When the hard link target is exactly 100 characters long, tar2sqfs cannot process the link and fails with the error "No such file or directory". The error appears during the post-processing stage, so you can archive more than 100 GB of data and the process will only fail after all that work is done... I created a simple tar file (see attachment) to reproduce this bug: err_repr.tar.gz. Run:

gzip -dc err_repr.tar.gz | tar2sqfs tst.sqfs

and the error appears:

Resolving hard link '/20CharsForLnkTest001/20CharsForLnkTest002/20CharsForLnkTest003/20CharsForLnkTest004/01234567890123456789' -> '20_characters_here01/20_characters_here02/20_characters_here03/20_characters_here04/errored_file_tstustar ': No such file or directory

tar unpacks this archive without any trouble.

Update: I found a dirty workaround for this bug. Open lib/tar/read_header.c and find this if-block in the decode_header function:

if (!(set_by_pax & PAX_SLINK_TARGET)) {
    out->link_target = strdup(hdr->linkname);
    if (out->link_target == NULL) {
        perror("decoding symlink target");
        return -1;
    }
}

and add one line at the end of this block, so it becomes:

if (!(set_by_pax & PAX_SLINK_TARGET)) {
    out->link_target = strdup(hdr->linkname);
    if (out->link_target == NULL) {
        perror("decoding symlink target");
        return -1;
    }
    /* dirty fix: forcibly terminate targets that fill the 100-byte field */
    if (strlen(hdr->linkname) >= 100)
        out->link_target[100] = '\0';
}

After compiling, it works smoothly.

AgentD commented 3 years ago

Hi,

thanks for pointing this out!

The problem here is that the tar header field for (sym|hard) link targets is exactly 100 bytes in size, and it is perfectly valid to fill it completely without leaving space for a null terminator. Currently that causes data from the adjacent header field to be appended to the link target.

The preferable fix for the problem is to replace the strdup with a strndup that copies at most sizeof(hdr->linkname) bytes and adds a null terminator if there is none.

With this patch I can process the example tarball you uploaded:

0001-Fix-libtar-treatment-of-link-targets-that-fill-the-h.patch.gz

iScriptLex commented 3 years ago

Thank you. I tested your fix on a complex directory structure containing more than 11,000 links. It was processed correctly.