cabforum / servercert

Repository for the CA/Browser Forum Server Certificate Chartered Working Group
https://cabforum.org/working-groups/scwg/

Make it possible to validate conformance of certificate serial number requirements #558

Open primetomas opened 1 week ago

primetomas commented 1 week ago

TLS BRs in section 7 (for all certificate types) contain a requirement for the certificate serial number: "MUST be a non‐sequential number greater than zero (0) and less than 2¹⁵⁹ containing at least 64 bits of output from a CSPRNG"

Several sources state that this requirement is not possible to validate automatically.

  1. The original 63 bit of entropy discussion
  2. PR 857 to zlint that was rejected because the conclusion is that it is not possible to validate.

For the new linting requirements on issued certificates, it would be highly beneficial if most, if not all, BR requirements could be verified automatically.

As the serial number requirement cannot be verified automatically using a linter, it should be rewritten into something verifiable.
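To make the split concrete, here is a minimal sketch (not from the thread; `lint_serial` is a hypothetical function) of which clauses of the BR serial number requirement a linter can mechanically check, and which it cannot:

```python
# Hypothetical sketch: the mechanically checkable clauses of the BR serial
# number requirement. The CSPRNG-output clause is the part this issue is
# about: it cannot be verified from a single serial number.

def lint_serial(serial: int) -> list[str]:
    """Return findings for the mechanically checkable clauses."""
    findings = []
    if serial <= 0:
        findings.append("serial MUST be greater than zero")
    if serial >= 2**159:
        findings.append("serial MUST be less than 2^159")
    # "at least 64 bits of output from a CSPRNG" is NOT checkable here:
    # any single integer, however small, can be valid CSPRNG output.
    return findings

print(lint_serial(0))       # violates the > 0 clause
print(lint_serial(2**160))  # violates the < 2^159 clause
print(lint_serial(12345))   # no findings; entropy still unknown
```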

timfromdigicert commented 1 week ago

Unfortunately, as much as I agree with the sentiment, I think this should be closed as invalid. Here's why.

  1. It's perfectly reasonable to have a requirement that certificate serial numbers be random and unpredictable, in a cryptographically relevant way.
  2. As noted in the discussions, the properties of cryptographically secure randomness and unpredictability are provably impossible to verify via a post-hoc examination of a single instance.

I realize some people want to fire all the auditors and only have requirements that can be verified by technical checks, but that isn't possible for some perfectly reasonable requirements, and I think this is one of them.

Unless this ticket has a concrete proposed solution, I'm afraid it's going to become an unproductive rathole.

primetomas commented 1 week ago

My proposed solution is to state a minimum number of bytes in the serial number. That would make it a lot harder to implement "wrong", unless willfully deceptive, and make it easy to verify.

timfromdigicert commented 1 week ago

I believe that is the way it was many years ago, before we fixed it to actually require good randomness. Someone should go back and look at the history and check.

I personally don't want to go back to a situation where rand() is a compliant way to generate certificate serial numbers. We spent quite a bit of time improving this requirement, and I'd rather not go backwards.

dzacharo commented 1 week ago

I agree with @primetomas . We do need to have auditable requirements as much as possible. We can achieve both the 64-bit randomness requirement and a minimum expectation on the size of a serial number. Theoretically, a true random generator could produce a number below a set threshold, which would need to be discarded and a new random number generated. Some CAs have decided to use serial numbers much larger than 64 bits, to make sure they get good, strong serial numbers.

I think it should be an easy ballot, and it would do justice to the CAs that had to revoke large numbers of certificates because of providing 63 bits of entropy in the past :-)

timfromdigicert commented 1 week ago

No, Dimitris, that's not how random numbers work. If you implement this random function:

```c
int randomnumber(int max) {
    int result;
    do {
        result = realrandom(max);
    } while (result < max / 1000); /* reject small, rare results, because they look suspicious */
    return result;
}
```

You're not making the random numbers better, you're just biasing them. The dice need to be unloaded.

A cryptographic random number generator MUST generate small numbers with the same probability as large numbers. This means that 1/256 of them have a first byte that's all zeros, and 1/65536 have two leading bytes that are all zeros. That's the correct behavior.

If a random number generator emits 42 three times in a row, you have a right to be suspicious, but with low probability, it will happen. Of course, if MAX is in the cryptographically-relevant range, you can read the monkey's Shakespeare sequel while you wait, but it is not 100% proof of brokenness.

Entropy cannot be measured from a post-hoc measurement of a single instance. That's how entropy works; it's part of the definition. Entropy is about the statistical properties of an ensemble, not a single instance. It's why single particles don't have temperatures, etc.
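The 1/256 figure above is easy to confirm empirically. A small illustration (not from the thread), using Python's standard `secrets` CSPRNG: over many 8-byte outputs, about one in 256 starts with a zero byte, exactly as a correct generator should.

```python
# Illustration of the frequency Tim cites: a CSPRNG must emit small
# numbers at the expected rate. Over many 8-byte outputs, roughly
# 1/256 of them have an all-zero leading byte.
import secrets

N = 200_000
leading_zero = sum(1 for _ in range(N) if secrets.token_bytes(8)[0] == 0)

print(leading_zero / N)  # close to 1/256 ≈ 0.0039
```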

barrini commented 1 week ago

Thomas, I still don't get it. If your suggestion is to have a minimum number of bytes, then it's not that random. I agree with Tim, but I'm not sure about your proposal.

dzacharo commented 1 week ago

Not sure about Thomas, but my suggestion would be to generate randomly, as we do today, and have a minimum number of bytes in the serial number.

@timfromdigicert I get the theory. We have a practical problem to solve, and if dropping smaller numbers is an acceptable risk that still fits the risk profile of serial number unpredictability, so be it. It would solve the problem of auditability.

primetomas commented 1 week ago

Exactly, minimum number of bytes, but still with entropy requirements.

I took a look at a bunch of public CAs, and currently most of them use serial numbers of 16 bytes or more. Only one I found used 8 bytes or fewer. There are many good ways to create this without messing up the entropy. With a 16-byte serial, creating numbers below 2^64 is quite unlikely, and discarding those will not hurt you much. Some CAs front-pad their serial numbers with fixed octets, putting all the entropy at the back end of the serial.

This would be a way to make a CA aware of a possible flaw before it hits them. The absolute majority of CAs today would already be fine, unless they introduce a bug, in which case this lint would alert them.
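A hedged sketch of the kind of generation described above (`generate_serial` is a hypothetical name, not from any CA's code): draw 16 CSPRNG bytes and discard values below 2^64. With 16 random bytes, the discard probability is about 2^-64, so the rejection loop almost never fires and the bias is negligible.

```python
# Hypothetical sketch: 16-byte CSPRNG serial with a minimum-size floor.
# Values below 2^64 are discarded and redrawn; with 16 random bytes this
# happens with probability ~2^-64, i.e. essentially never.
import secrets

def generate_serial(num_bytes: int = 16, minimum: int = 2**64) -> int:
    while True:
        candidate = int.from_bytes(secrets.token_bytes(num_bytes), "big")
        if candidate >= minimum:
            return candidate

serial = generate_serial()
print(serial >= 2**64)  # True: the serial always meets the size floor
```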

Sec-Wayne commented 1 week ago

I entirely agree with @primetomas' approach to just change the BRs.

I did research into this many months ago and spoke to a few Root Program representatives and CAs. A common sticking point throughout the historical discussions is the difference between generation of the number and what is ultimately placed in the certificate when it is encoded via ASN.1. Never state the e word.

Rehashing some of my statements to those parties below:

The core intent is pre-image resistance by putting more random values at the start, so while I understand that DER encoding it down to 02 01 01 is strictly correct compared to 02 08 00 00 00 00 00 00 00 01, it kinda defeats the purpose.

This comes from ASN.1, where the 01 and 08 are encoded lengths, and the encoding compresses values. CAs have generally worked around this shortcoming by including prefixes at the beginning to avoid the issue entirely.

I will note that there is a subtle difference between the BRs and MRSP. This was noted by Apple:

https://groups.google.com/g/mozilla.dev.security.policy/c/LfcGEnpfV1M/m/DFM-WL6VCAAJ https://bugzilla.mozilla.org/show_bug.cgi?id=1533655

2016-09-30 - CA/Browser Forum requires the use of 64-bit entropy when generating serial numbers [2].
2017-02-28 - The Mozilla Root Store Policy requires that certificates must have a serial number greater than zero (0) containing at least 64 bits of output from a CSPRNG [3].

Two different points here: 'output', but also generation vs. what is in the certificate itself. Before even attempting a conversation on this subject, make sure everyone is very clear on the definitions, or everyone will talk past one another.

However yes, there is only a single CA left that is actively generating certificates at concerning thresholds. For those with Censys access:

(labels="precert" and validation.nss.has_trusted_path=true and not labels="revoked") and parsed.serial_number_hex: /.{0,16}/

We've been generally moving toward consensus compliance: if a majority of parties read the rules one way and there's a single outlier, that outlier is de facto out of compliance. Not that this would necessarily generate an incident, but it would make it clear that some clarification is required to make sure everyone understands the bare minimum.