cose-wg / CBOR-certificates


Is the reduction of certificate size the killer feature of C509? #105

Closed ralienpp closed 1 year ago

ralienpp commented 1 year ago

After reading rev-05 of the draft, my understanding is that size reduction is the main benefit of switching to C509. This is outlined in the introduction, and to the best of my knowledge, no further advantages are mentioned, except maybe: "... removes the need for ASN.1 encoding, which is a rich source of security vulnerabilities".

I'll put on the devil's advocate hat; perhaps this will help shape the RFC in a way that builds a more convincing sales pitch.

  1. Are there other immediate benefits besides a lower size?
  2. I am not at all convinced by "ASN.1 is a rich source of vulnerabilities". Unless CBOR is somehow intrinsically more secure by design, what makes us think that all CBOR parsers will be implemented without errors? While I admit that ASN.1 has inflicted a lot of suffering on me, I don't think it is judged fairly here.
  3. How future-proof is C509 in light of post-quantum crypto, where keys and signatures are much larger? (see pic at the bottom) A random key should not contain patterns, so it wouldn't be compressible; if the key takes the lion's share of the cert's size, wouldn't the benefits of C509 become less tangible?

Potential upsides: Is it possible to construct an argument that CBOR would require fewer computing resources to handle? For example, if one could claim that deserializing CBOR takes less RAM or fewer CPU cycles, or a shallower call stack... then it would be a strong reason in favour of using it on constrained IoT devices.

In addition, could one claim that CBOR lowers the cognitive burden on programmers? The built-in diagnostic representation is very useful, and one can easily convert it to JSON - so the existing skills would be easily transferable.
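To make the compactness argument tangible, here is a minimal sketch (stdlib-only Python; a hand-rolled encoder covering just the types used here, not a full CBOR codec — real code would use a proper library) comparing a toy certificate-like map in CBOR and JSON. All field names and sizes are made up for illustration:

```python
import json
import struct

def cbor_head(major, arg):
    """Encode a CBOR initial byte plus its unsigned argument (< 2**32)."""
    if arg < 24:
        return bytes([(major << 5) | arg])
    if arg < 256:
        return bytes([(major << 5) | 24, arg])
    if arg < 65536:
        return bytes([(major << 5) | 25]) + struct.pack(">H", arg)
    return bytes([(major << 5) | 26]) + struct.pack(">I", arg)

def cbor_encode(obj):
    """Tiny CBOR encoder: non-negative ints, bytes, text, lists, maps only."""
    if isinstance(obj, bool):
        raise TypeError("not supported in this sketch")
    if isinstance(obj, int) and obj >= 0:
        return cbor_head(0, obj)              # major type 0: unsigned int
    if isinstance(obj, bytes):
        return cbor_head(2, len(obj)) + obj   # major type 2: byte string
    if isinstance(obj, str):
        b = obj.encode("utf-8")
        return cbor_head(3, len(b)) + b       # major type 3: text string
    if isinstance(obj, list):
        return cbor_head(4, len(obj)) + b"".join(map(cbor_encode, obj))
    if isinstance(obj, dict):
        return cbor_head(5, len(obj)) + b"".join(
            cbor_encode(k) + cbor_encode(v) for k, v in obj.items())
    raise TypeError(type(obj))

# Toy certificate-like record (hypothetical fields, not the C509 layout).
serial, issuer, pubkey = 123456, "Example CA", bytes(32)
cbor_size = len(cbor_encode({"serial": serial, "issuer": issuer, "key": pubkey}))
json_size = len(json.dumps({"serial": serial, "issuer": issuer,
                            "key": pubkey.hex()}).encode("utf-8"))
print(cbor_size, json_size)  # CBOR avoids hex/base64 expansion of binary data
```

Note that most of the gap here comes from CBOR carrying the key as raw bytes while JSON must hex-encode it, which is exactly the kind of overhead a binary certificate format sidesteps.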

P.S. MITRE reports 137 ASN.1-related vulnerabilities, ~2200 for XML and 872 for JSON. CBOR has only 5 entries but, relatively speaking, it is the "new kid on the block", whereas ASN.1 has been around for almost 40 years and is very widely used.

Re: key sizes: here's an excerpt from Wikipedia's post-quantum cryptography page: [image]

highlunder commented 1 year ago

First of all, a big thank you for your questions and observations! We definitely need to motivate the work well enough for those who already have an interest in the area before we can hope to reach a wider audience.

There are other aspects to it as well, but my reasoning is the following:

I agree that the byte savings per individual certificate are not a strong enough motivation on their own. A wider perspective is that moving to a CBOR-encoded format is in line with the other newer IoT standards, such as COSE and those that use it, for instance EDHOC and OSCORE. For a ‘powerful enough’ IoT device it is probably OK to keep multiple libraries around to do similar tasks and hence be able to handle both CBOR and ASN.1. But being able to trim the library dependencies down to a minimal usage profile is part of why I think moving to CBOR-based encodings makes sense.

There are ongoing discussions on how to incorporate the C509 format into both enrollment and revocation operations for IoT, which are currently more heavyweight operations that are not easily trimmed down if one needs to keep full backward X.509 compatibility.

In a sense, the above arguments are in line with your ‘lowers the cognitive burden on programmers’ observation. The fewer libraries and dependencies, the lower the cost of a stricter security analysis.

About the ‘potentially fewer computing resources to process’: except for the special case of ECC public key compression, this will continue to depend on the signature algorithms used, which are handled by the crypto libraries and considered outside of the certificate format itself. As long as both X.509 and C509 are capable of specifying and carrying the ‘most efficient for IoT devices to validate’ signatures, I don’t see any difference in this area.

About the need for longer keys, and especially longer signatures, to be quantum-safe(r): yes, with larger key+signature data to handle, the extra certificate data is relatively smaller, but that shouldn’t mean we ignore it. To further complicate things, at least one of the leading hash-based PQC signature candidates, HSS/LMS, “can only be used for a fixed number of signing operations” (https://www.rfc-editor.org/rfc/rfc8778.html), so we might end up needing more frequent certificate renewals. This can be seen as an argument either for or against caring about the relatively small additional cost of the other certificate data. Having said this, while it’s definitely time to start planning for PQC, I think we have some more years in which the systems and algorithms we use today can be handled efficiently.
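To make the ‘relatively smaller’ point concrete, here is a back-of-the-envelope sketch with purely illustrative numbers (none of them come from the draft or from measured certificates): assume C509 saves a fixed chunk of encoding overhead, and watch the relative saving shrink as the key+signature material grows.

```python
def relative_saving(crypto_bytes, noncrypto_x509=500, saved=200):
    """Fraction of total certificate size saved by a fixed encoding gain.

    All byte counts are hypothetical, chosen only to show the trend:
    noncrypto_x509 = non-cryptographic part of an X.509 cert,
    saved = bytes a more compact encoding removes from that part.
    """
    return saved / (crypto_bytes + noncrypto_x509)

for label, crypto in [("ECC-sized key+sig (~150 B)", 150),
                      ("PQC-sized key+sig (~4000 B)", 4000)]:
    print(f"{label}: cert ~{relative_saving(crypto):.0%} smaller")
# With the same fixed saving, the relative benefit drops from ~31% to ~4%.
```

The absolute saving per certificate is unchanged, which is why it still matters for radio transmission budgets even when it becomes a small percentage of the total.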

Once again, thank you for the observations! I hope we, through our continued work, will be able to make the potential advantages sufficiently clear.

For the interested, some further PQC reading:

"An Update on NIST’s Post-Quantum Crypto Standardization": https://x9.org/wp-content/uploads/2020/02/PQC-update-2020.pdf

"LMS vs XMSS: Comparion of two Hash-Based Signature Standards" https://eprint.iacr.org/2017/349.pdf

highlunder commented 1 year ago

We have discussions in the text on the efficiency of CBOR, its processing, and small code size.