Closed LoupVaillant closed 1 year ago
Further reading:
The entire space is kind of chaotic, and apparently no two libraries agree on anything except the strictest checks. It looks like libsodium got stricter, and in doing so actually broke some cryptocurrency. From an end-user perspective (as inconvenient as it is to depend on someone else's upstream), it would at least be useful to do what some other library does. The spec leaving this much room seems to have ended up being an actual issue.
Okay, I should have read that piece from Henry de Valence sooner (I knew about it, but was too lazy to find it).
Turns out I was wrong on 2 counts:
Reading de Valence’s article, I see that my points (1), (2), and (3) above are broadly agreed upon:
Divergence & choice then boils down to 3 criteria:
For our purposes, I believe we must use the batch equation. Yes, it comes at an initial cost (one point addition and 3 point doublings, or perhaps just the addition if we’re clever with comparisons), but the potential optimisation (both Pornin’s and genuine batch optimisation) is in my opinion too important to pass up.
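For reference, the two candidate verification equations (both permitted by RFC 8032, and contrasted in de Valence’s article) are:

```latex
% h = SHA-512(R || A || M), reduced mod L
% Strict ("cofactorless") equation:
\[ [S]B = R + [h]A \]
% Cofactored ("batch-friendly") equation, insensitive to
% low-order components of R and A:
\[ [8][S]B = [8]R + [8][h]A \]
```

Multiplying both sides by the cofactor 8 kills any low-order component, which is exactly why the two equations can disagree on contrived signatures.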
Then there’s whether we should reject low-order A. Libsodium does. The RFC doesn’t. Great.
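To make the low-order question concrete, here is a self-contained Python sketch (illustration only, not Monocypher code) of what “reject low-order A” means: decompress the point and multiply it by the cofactor 8; a low-order point collapses to the identity. Libsodium implements the check differently (against a fixed list of small-order encodings), but the effect is the same.

```python
# Sketch: detect a low-order Ed25519 point by multiplying it by the
# cofactor 8 (three doublings) and checking for the identity (0, 1).
P = 2**255 - 19                               # field prime
D = -121665 * pow(121666, P - 2, P) % P       # twisted Edwards constant d

def decompress(enc: bytes):
    """Decode a 32-byte point encoding; return (x, y) or None."""
    y = int.from_bytes(enc, "little") & ((1 << 255) - 1)
    sign = enc[31] >> 7
    if y >= P:
        return None                            # non-canonical y
    # Curve: -x^2 + y^2 = 1 + d x^2 y^2, so x^2 = (y^2 - 1)/(d y^2 + 1)
    u = (y * y - 1) % P
    v = (D * y * y + 1) % P
    w = u * pow(v, P - 2, P) % P               # candidate x^2
    x = pow(w, (P + 3) // 8, P)                # square root for p = 5 mod 8
    if (x * x - w) % P:
        x = x * pow(2, (P - 1) // 4, P) % P    # multiply by sqrt(-1)
    if (x * x - w) % P:
        return None                            # not on the curve
    if x & 1 != sign:
        x = P - x
    return (x, y)

def add(p, q):
    """Affine twisted Edwards addition (complete, so doubling works too)."""
    (x1, y1), (x2, y2) = p, q
    t = D * x1 * x2 * y1 * y2 % P
    x3 = (x1 * y2 + x2 * y1) * pow(1 + t, P - 2, P) % P
    y3 = (y1 * y2 + x1 * x2) * pow(1 - t, P - 2, P) % P
    return (x3, y3)

def is_low_order(enc: bytes) -> bool:
    pt = decompress(enc)
    if pt is None:
        return True                            # undecodable: reject anyway
    for _ in range(3):                         # multiply by 8
        pt = add(pt, pt)
    return pt == (0, 1)                        # identity means low order
```

The identity and the order-2 point (y = -1) come out low-order, while the base point does not. A real implementation would of course use projective coordinates and constant-time field arithmetic rather than Python bignums.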
Finally, the RFC mandates canonical encodings for A and R. Monocypher currently mandates canonical R by accident, as do many other libraries, but that comes from our use of the strict equation. And dammit, explicitly checking for canonical encodings is cumbersome.
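The canonicity condition itself is tiny when you have bignums; what’s cumbersome in C is doing the comparison on serialised limbs. A Python sketch (hypothetical helper, not Monocypher’s API), checking only that the masked 255-bit y coordinate is below the field prime:

```python
# A point encoding is canonical when its y coordinate (sign bit masked
# off) is strictly below p = 2^255 - 19.
P = 2**255 - 19

def is_canonical(enc: bytes) -> bool:
    """True iff the 255-bit y coordinate is below the field prime."""
    y = int.from_bytes(enc, "little") & ((1 << 255) - 1)  # drop sign bit
    return y < P
```

Note this sketch checks only y; some libraries additionally reject encodings where x = 0 yet the sign bit is set.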
Overall, I see 3 options: follow the RFC to the letter, do what libsodium does, or do what Zebra does.
I think this is the death knell for the RFC: nobody actually follows it (well, Bouncy Castle does, but with the strict equation). This leaves us with libsodium and Zebra. The way I see it, the winner here is Zebra: allowing non-canonical encodings and low-order keys has absolutely zero impact on security, it’s simpler, and marginally faster. And if compatibility is an issue, it’s probably better to start accepting signatures that were previously rejected than to start rejecting signatures that were previously accepted.
Unless there’s an objection, I think it is best to go for the maximally permissive option (3).
We still need the test vectors to make sure we're Zebra compatible.
EdDSA signatures are finicky. The basic properties are there, and pretty much every library agrees on the following:
Some other details however can cause libraries to disagree about the validity of a signature, and this has caused problems in existing systems & protocols (hoping I’m not forgetting anything):
Currently, Monocypher:
According to section 3.4 of RFC 8032, Monocypher complies with points (1), (2), and (3). The carefully paranoid reader will note that the current code includes no explicit check on R, but the final comparison is guaranteed to fail whenever R is not on the curve: [S]B - [h]A definitely is on the curve, and when serialised will necessarily differ from any R that isn’t.
We do however have a problem with respect to point (4), and I want to take the opportunity of the upcoming major version change to correct that. See, we’re a little stricter here than we need to be: with the simplified equation we’re using, if R or A (or both) have a low-order component, that low-order component isn’t supposed to influence the final result. We’re supposed to project everything to the prime-order subgroup and then verify the signature. Or verify the official equation, which has the same effect. Complying here will bring 2 advantages:
TODO: