soarecostin / file-vault

A Laravel package for encrypting and decrypting files of any size
MIT License

Decryption error when S3 returns last block with wrong size #19

Open gbalduzzi opened 2 years ago

gbalduzzi commented 2 years ago

Hi @soarecostin , thank you for the great library.

I noticed in my app that I sometimes got decryption errors on files from S3, but retrying the same operation on the same file a few seconds later would succeed.

I noticed that the FileEncrypter has a check in place to handle different chunk sizes received from s3:

// Because Amazon S3 will randomly return smaller sized chunks:
// Check if the size read from the stream is different than the requested chunk size
// In this scenario, request the chunk again, unless this is the last chunk
if (strlen($ciphertext) !== 16 * (self::FILE_ENCRYPTION_BLOCKS + 1)
    && $i + 1 < $numberOfChunks
) {
    fseek($fpIn, 16 + 16 * (self::FILE_ENCRYPTION_BLOCKS + 1) * $i);
    continue;
}

I added some custom logs and noticed that the bug always happens while trying to decode the last block. Could it happen when S3 returns the last chunk with a wrong size? There doesn't appear to be any check in place to detect (and fix) such an occurrence.

I will try to fix the issue and submit a pull request if I succeed.

drewwbctz commented 2 years ago

I was running into the same issue when decrypting files from S3. My app would randomly return a "Decryption failed" error. It didn't always happen, and it seemed to be fairly random. In my case, I was able to fix the issue by adding a check for $plaintext in addition to the chunk size, like this:

        // We have to read one block more for decrypting than for encrypting because of the initialization vector
        $ciphertext = fread($fpIn, 16 * (self::FILE_ENCRYPTION_BLOCKS + 1));
        $plaintext = openssl_decrypt($ciphertext, $this->cipher, $this->key, OPENSSL_RAW_DATA, $iv);

        // Because Amazon S3 will randomly return smaller sized chunks:
        // Check if the size read from the stream is different than the requested chunk size
        // In this scenario, request the chunk again, unless this is the last chunk
        if (strlen($plaintext) !== 16 * self::FILE_ENCRYPTION_BLOCKS
            && $i + 1 < $numberOfChunks
            || $plaintext === false // <-ADDED THIS CHECK TO SEE IF PLAINTEXT IS FALSE
        ) {
            fseek($fpIn, 16 * self::FILE_ENCRYPTION_BLOCKS * $i);
            continue;
        }

After adding the additional check, it seems to decrypt the file(s) without issue every time.

gbalduzzi commented 2 years ago

Thanks @drewwbctz. The only problem with that check is that, in the case of a different decryption failure (such as a wrong key or a malformed file), the code would end up in an infinite loop, if I'm not mistaken.
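For anyone else hitting this: one way to keep the retry while avoiding that infinite loop is to cap the number of re-read attempts. A minimal sketch (the function name `readChunkWithRetry` and the retry limit are my own, not part of the package):

```php
<?php
// Hypothetical sketch, not the package's actual code: re-read a short
// chunk from the stream, but bail out after a fixed number of attempts
// so a persistent failure (wrong key, truncated file) cannot spin forever.
function readChunkWithRetry($fpIn, int $chunkSize, int $offset, int $maxRetries = 3): string
{
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        fseek($fpIn, $offset);
        $ciphertext = fread($fpIn, $chunkSize);

        // Full chunk read: hand it back for decryption.
        if (strlen($ciphertext) === $chunkSize) {
            return $ciphertext;
        }
        // Short read (e.g. S3 returned fewer bytes): loop and re-read.
    }

    throw new RuntimeException(
        "Still reading short chunks at offset {$offset} after {$maxRetries} retries"
    );
}
```

The caller would still need to treat the final chunk specially (it is legitimately shorter), but a genuinely corrupt stream now surfaces as an exception instead of a hang.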

I managed to fix it in my codebase by checking the total file size; I will try to submit a pull request as soon as I can.
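The file-size idea can be sketched roughly like this (a hypothetical helper, not the actual fix): knowing the stream's total byte length up front tells you how many chunks to expect and exactly how large the final one should be, so a short last read can be validated rather than retried blindly or mistaken for corruption.

```php
<?php
// Hypothetical sketch of the total-file-size approach (not the submitted
// PR): given the file size, compute the expected byte length of chunk $i
// so the legitimately shorter final chunk can be told apart from a
// transient short read that should be requested again.
function expectedChunkSize(int $fileSize, int $ivSize, int $chunkSize, int $chunkIndex): int
{
    $payload = $fileSize - $ivSize;  // ciphertext bytes after the IV header
    $numberOfChunks = intdiv($payload + $chunkSize - 1, $chunkSize);  // ceiling division

    if ($chunkIndex + 1 < $numberOfChunks) {
        return $chunkSize;  // every chunk but the last is full-size
    }

    // Last chunk: only the remaining bytes.
    return $payload - $chunkSize * ($numberOfChunks - 1);
}
```

With this, the read loop can compare `strlen($ciphertext)` against the expected size for that index: a mismatch on a non-final chunk means "retry", while a mismatch on the final chunk means the file really is truncated.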