stanimirivanovde / browser-power

A showcase of client side javascript that can run in the browser
MIT License

use web crypto instead? #1

Open · jimmywarting opened this issue 3 years ago

jimmywarting commented 3 years ago

Why involve Node stuff in the browser (i.e. node-forge)? Can't you achieve the same thing using Web Crypto?

I bet it has better performance (since it's native). I also think you should be using readAsArrayBuffer() and not readAsBinaryString().

The use of readAsArrayBuffer() is preferred over readAsBinaryString(), which is provided for backwards compatibility.

Technically you could then just do blob.arrayBuffer() if you like promises... or new Response(blob).arrayBuffer(), and skip the whole FileReader.
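For reference, a minimal sketch of the two FileReader-free approaches (the Blob here is just a stand-in for whatever File/Blob you already have):

```javascript
// `blob` stands in for the File/Blob you already have.
const blob = new Blob(["hello"]);

// The modern one-liner:
const buf = await blob.arrayBuffer();

// Or via the fetch Response wrapper, which also works in older
// environments that have fetch but not Blob.prototype.arrayBuffer():
const buf2 = await new Response(blob).arrayBuffer();
```

Both return a promise for the same bytes, so either drops in wherever the FileReader result was used.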

stanimirivanovde commented 3 years ago

First of all thanks for looking into the code @jimmywarting! I appreciate any tips and advice.

The reason I decided to go with forge was the requirement to process data in chunks in order to achieve constant-memory operation. The Web Crypto API, as far as I can tell from the SubtleCrypto implementation, does not support encrypting a file in chunks: it operates on the whole blob of data at once. That is unfortunately not what I need, which is a shame, because I'm pretty sure it would provide the best performance out there.

My requirement is to have encrypt(), decrypt() and hash() functions that can operate on a chunk of data, and forge has an update() method that is perfect for this. I did try CryptoJS and sjcl.js, but both of them were slower than forge.
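Roughly, the chunked pattern with forge looks like this (a sketch only: the key, IV, and chunk source are elided, and the variable names are illustrative, not from the repo):

```javascript
// Chunked AES-GCM with node-forge: update() is called once per chunk,
// so only one chunk needs to be in memory at a time.
const forge = require("node-forge"); // or a bundler / <script> tag

const cipher = forge.cipher.createCipher("AES-GCM", key); // key: binary string
cipher.start({ iv: iv });                                 // iv: binary string

for (const chunk of chunks) {          // chunks: binary-string pieces of the file
  cipher.update(forge.util.createBuffer(chunk));
  // Drain the ciphertext produced so far to keep memory constant:
  const encryptedPiece = cipher.output.getBytes();
  // ...write encryptedPiece to the output stream...
}

cipher.finish();
const tag = cipher.mode.tag; // 16-byte GCM authentication tag
```

Hashing follows the same shape with forge.md.sha256.create() and repeated md.update() calls.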

I played around with readAsArrayBuffer() and readAsBinaryString(), and my finding was that readAsBinaryString() makes encryption/decryption/hashing run about twice as fast. I followed the code for forge.util.createBuffer(), and it looks like it does an extra array copy if the input is not a binary string: it constructs a ByteStringBuffer(b), which unfortunately hurts performance. It took longer to copy the ArrayBuffer into the destination buffer than it took to encrypt/decrypt/hash the data 😞. This might be a bug in forge itself.

The way I understand it, I can't avoid forge.util.createBuffer(), because all forge operations require the data to already be in that format.

jimmywarting commented 3 years ago

I see, well, have a look at these two:

stanimirivanovde commented 3 years ago

I gave a 👍 to the webcrypto project.

The SO suggestion is rather complex for an authenticated encryption (AEAD) cipher such as AES-GCM.

What he suggests is to chunk the data into 1 MiB encrypted blobs, each with its own authentication tag, and then at the end run another encryption pass over all of the authentication tags as additional authenticated data in order to prevent reordering attacks. I'm not sure all this work is necessary if the forge library already handles this case. 9 MB/s is not too bad of a throughput while we wait for a proper native implementation.