w3c / webcrypto

The W3C Web Cryptography API
https://w3c.github.io/webcrypto/

make the api string and base64 friendly #308

Closed r-jo closed 2 years ago

r-jo commented 2 years ago

Hi, my use case is something like the one described here: https://www.w3.org/TR/WebCryptoAPI/#multifactor-authentication A symmetric HMAC on server and client, where I transmit the key to the client and the signature to the server in base64url format...

I use the WebCrypto API to store the symmetric key as a non-extractable CryptoKey object in IndexedDB... this way I can authenticate the browser as a trusted access point whenever needed, like when refreshing an access token, since the crypto key is not extractable via JS and is transmitted only once.

The developer experience is somewhat unfriendly since the API operates on JS ArrayBuffers. In Java or other languages it is fine to operate on byte[], since conversions are efficient and the libraries are native, but JS lacks easy, straightforward conversions. Even if there is now, or will be in the future, an easy way to convert base64(url) <-> ArrayBuffer at the ECMAScript level, it would be better to have a richer WebCrypto API interface, where we can also pass inputs and get outputs as JS strings: base64 or base64url strings (or even hex).

It would be really easy to solve this at the browser-code level (actually, when importing from jwk, some of it is already solved). Please check the code below, where I illustrate how the API could work from a developer perspective, and how it works now.

MY CONCRETE SUGGESTIONS: https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/importKey

https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/sign

In addition, the same (making it string and base64 friendly) could be done in the other api methods as well.

The solution I came up with might be improved, or might even be wrong, though it seems to work. What matters most, however, is that right now everybody has to come up with (and ship as JS code) their own good or bad solution, instead of relying on a few lines of tested C++ or Rust code in the browsers, written by people who really know what they are doing... not like me :)

WOULD BE GREAT to use WebCrypto like:

export class CryptoLib {
  static signMessageHmacSha(message, keyBase64Url, hashAlgo) {
    // hypothetical 'b64u' format: the key material arrives as a base64url string
    return window.crypto.subtle.importKey('b64u', keyBase64Url, hashAlgo, false, ['sign'])
      // hypothetical extra argument: ask for the signature back as base64url
      .then((importedKey) => window.crypto.subtle.sign('HMAC', importedKey, message, 'b64u'));
  }
}

MY CURRENT VERSION:

export class CryptoLib {
    static signMessageHmacSha(message, keyBase64Url, hashAlgo) {
        return new Promise(function (resolve, reject) {
            window.crypto.subtle
                .importKey('raw', CryptoLib._base64UrlToArrayBuffer(keyBase64Url), hashAlgo, false, ['sign'])
                .then((importedKey) => {
                    window.crypto.subtle
                        .sign('HMAC', importedKey, new TextEncoder().encode(message))
                        .then((signature) => {
                            resolve(CryptoLib._arrayBufferToBase64Url(signature));
                        })
                        .catch((reason) => {
                            // reject() takes a single value, so fold the context into an Error
                            reject(new Error('problem with sign: ' + reason));
                        });
                })
                .catch((reason) => {
                    reject(new Error('problem with import: ' + reason));
                });
        });
    }

    // arrBuff -> byte[] -> binary string -> b64 -> b64u
    static _arrayBufferToBase64Url(arrayBuffer) {
        let base64Url = window.btoa(String.fromCodePoint(...new Uint8Array(arrayBuffer)));
        base64Url = base64Url.replaceAll('+', '-');
        base64Url = base64Url.replaceAll('/', '_');

        // note: any trailing '=' padding is left in place
        return base64Url;
    }

    // b64u -> b64 -> binary string -> byte[] -> arrBuff
    static _base64UrlToArrayBuffer(base64Url) {
        let base64 = base64Url.replaceAll('-', '+');
        base64 = base64.replaceAll('_', '/');
        const binaryString = window.atob(base64);
        const length = binaryString.length;
        const bytes = new Uint8Array(length);
        for (let i = 0; i < length; i++) {
            bytes[i] = binaryString.charCodeAt(i);
        }

        return bytes.buffer;
    }
}
tniessen commented 2 years ago

> in the importKey() method let keyData be some js string where we can indicate by the format parameter "b64" or "b64u" for base64 with +/ characters and base64Url with -_ characters.

Would that be base64-encoded "raw" data? It could also be base64-encoded "pkcs8" or "spki" data, which is essentially PEM at that point. If not using PEM, I don't think supporting just one specific string encoding (e.g., base64) is justified when there are many more that are commonly used (e.g., hex).

The "jwk" format is discouraged in the WebCrypto spec ("Support of "raw" key formats is encouraged for interoperability")

I don't think that's true. I believe the part you are quoting is saying that implementations of the WebCrypto API are encouraged to support "raw" formats, in addition to other formats such as "jwk", for interoperability with non-WebCrypto applications.

> in the sign() method, let data be some js string too

Personally, I am not a fan of supporting strings in cryptographic APIs, especially as inputs to cryptographic primitives. I added some reasons to the Node.js documentation a while ago. Also, you'd quickly run into problems when your message is not UTF-8 encoded.
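
The point about non-UTF-8 input can be made concrete with a short sketch (an illustration added here, not code from the thread): arbitrary bytes do not survive a round trip through a JS string, because invalid UTF-8 sequences are replaced with U+FFFD.

```javascript
// Sketch (illustrative, not from the thread): binary data is lossy when
// carried in a JS string, since invalid UTF-8 is replaced with U+FFFD.
const original = new Uint8Array([0xc3, 0x28]); // 0xc3 opens a 2-byte UTF-8
                                               // sequence; 0x28 cannot close it
const asString = new TextDecoder().decode(original);      // "\uFFFD("
const roundTripped = new TextEncoder().encode(asString);
// roundTripped is [0xef, 0xbf, 0xbd, 0x28] (the UTF-8 bytes of U+FFFD plus
// '('), not the original [0xc3, 0x28]
```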

> Even if now or in the future there is an easy way to convert b64(url)<->arrBuff on ecma script level, it would be better if we just had a richer WebCryptoApi interface, where we can also use inputs and get outputs as some js strings, base64 or base64Url strings (or even hex).

I think I personally disagree with this. There are some security benefits of doing string conversions within the runtime implementation and not in JavaScript, but I don't think they outweigh the added complexity.

r-jo commented 2 years ago

It is difficult to argue if you pick out only some parts of my proposal. I made a point of showing how I would like to use the API (easy and clear) and how most of us are forced to use it now (see the code examples above). If we are forced to convert things in JS on the client (not Node.js; Node.js has more capabilities than JS in the browser), plenty of code is needed, and plenty of us are going to make mistakes.

What is the problem with letting us use b64, b64u, or hex when we can already use jwk, which is redundant and only base64url? The browser receives data from the server as a subset of ASCII text, so someone has to do the conversion to an ArrayBuffer. Why is it better for every web programmer to write their own JS conversion library and ship it (or use some library and ship it), instead of being able to use native, tested C++ or Rust code in the browsers, made correct and tested once by top engineers?

Please argue for the big picture: why is 1 better than 2?

  1. force every browser JS (not Node) programmer to convert base64 or hex themselves, or to use jwk, which is only base64url, redundant, and badly documented (its possible fields are spread across three specifications; which field is required, and which is required for the WebCrypto API?)
  2. just do it natively in the browser, once, with good, tested C++ or Rust libraries

Regarding the points you made:

Again, it adds NO complexity, only usability and ease of use... jwk is complex and it is part of the API... why not hex, b64, b64u?

r-jo commented 2 years ago

I just checked the Web Cryptography API spec. It was clearly intended for client-side JS, not for Node.js, so please do not argue as if Node.js were comparable to Go, Java, C++ etc., where cryptography APIs may use byte[]... this was pushed by Netflix, clearly for use cases like mine, or to identify browsers that have the right to play Netflix videos...

So please do not try to imitate C++ and co. with JS... you are free to operate on ArrayBuffers in Node (and, if I am not mistaken, Node has better conversion tools than atob/btoa).

I really see no downside to it: I could implement it in a weekend and test it in a week; the much better browser implementers could implement it in 2 hours and test it in a weekend...

The only question is whether plenty of people use the API with base64 or hex encodings... because if so, the API should be extended. I argue we are practically all forced to do these conversions, since we transmit things in cookies as ASCII text, using a subset (like base64 or hex) of the 128 7-bit ASCII characters.

It is not about Node.js, or whether Node.js is as cool as C++ or Java because it forces you to think in byte[]. It is about client-side JS and API usability: what should be browser code, and what should we write ourselves in JS. I argue that base64 <-> ArrayBuffer conversion should be browser code.

r-jo commented 2 years ago

Please google "js crypto base64 array buffer" or similar and take a look at the Stack Overflow threads... the mere existence of such horrible discussion threads tells us something went wrong along the way. People have no idea about low-level stuff like encoding and decoding. Some of us might figure things out in the end and even learn from it, but most web programmers are not the best qualified, as we all know, and setting them (and us) up to make horrible mistakes in crypto code is just a sin.

panva commented 2 years ago

I also think the Web API desperately needs better support for base64/base64url/hex encoders/decoders in and out of ArrayBuffers/typed arrays. But I would say that is a general need and not something to slap onto the Web Cryptography API specifically.

r-jo commented 2 years ago

You are right in the sense that I would not have complained or made a proposal if these conversions were a single function call in the Web API...

But what I see is that it would be really easy to add support at the Web Crypto API level, and if jwk support is there, I don't see why we don't (or can't) have b64/hex support... jwk is way more complex, yet it was still added... b64/b64u/hex would be more "native" than anything else we support besides the raw type (bytes, aka ArrayBuffers).

Actually, if you use the jwk format, you could just as well read out the key (in b64u format) with JSON tools, along with other things like the key's use case, and pass those values as input parameters to key import in the Web Crypto API. Instead, you give a jwk object to the API and also give the other parameters that may already be in the jwk string, and if something is missing or incompatible (say, in the jwk the key is only for signing but you pass sign and verify in the other Web Crypto API parameters), it throws... this behavior is what I called redundant... and it has a smell.

Actually, I think a nice clean API would support b64, b64u, and hex but not jwk... but if jwk is supported, at least b64, b64u, and hex should be supported as well.

I find the WebCrypto API good, but if it is easy, it should offer more than the bare minimum... if there were only support for raw types and no support for jwk, I would understand better (though not totally) the argument to push the responsibility to client-side JS and the Web API...

I do not think it would harm the API if it could import/export some common formats, even if the Web API itself gets better. I just don't see the downside (especially with jwk support already in place), but I see tremendous benefits in the next few years, and some benefit even after client-side JS reaches a better level in, say, 5 years.

twiss commented 2 years ago

I agree that this is a pain point of the Web API; however, I also think the Web Crypto API is not the correct place to fix it, as @panva said.

In fact, there are some conversion functions in the Web API already, namely btoa and atob, even if they (very unfortunately) use the weird "binary string" concept rather than Uint8Arrays (due to predating them). This further indicates that that's where it belongs.

It would be better to propose adding a modern alternative to btoa and atob to the Web API, e.g. you could have Base64Encoder[Stream] and Base64Decoder[Stream], similar to the TextEncoder[Stream] and TextDecoder[Stream] APIs (but with the types reversed, i.e. base64 encoding goes from Uint8Array to string), rather than trying to shoehorn it into the Web Crypto API. Even if the latter seems "easier", I think you will have an easier time convincing the browsers to implement the former. The Web Incubator Community Group (WICG) would probably be the best place to discuss this, I think.
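
For reference, the "binary string" detour that today's btoa/atob impose looks something like this (a sketch of the status quo, not of the proposed Base64Encoder API):

```javascript
// Sketch of the "binary string" detour: btoa/atob work on strings whose char
// codes are 0-255, so bytes must be mapped through such a string by hand.
function bytesToBase64(bytes) {
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary);
}

function base64ToBytes(base64) {
  const binary = atob(base64);
  return Uint8Array.from(binary, (ch) => ch.charCodeAt(0));
}

// bytesToBase64(new Uint8Array([72, 105])) === 'SGk='
```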