Closed Wind4Greg closed 1 year ago
Hey Greg.
Could you try running verification with the option `zip215: false`?
Hmm, I'm not seeing the difference. I can see your RFC8032 checks in https://github.com/paulmillr/noble-curves/blob/9bee88888f9bbd596605d05d83c5e7a427752bc2/src/abstract/edwards.ts#L358-L372
The cases (8-11) are for non-canonical A and R values which would have been caught.
Am I setting the option wrong? I'm using:
let result = await ed.verify(signature, message, pub_key, {zip215: false});
Thanks Greg
So, we don't have binding signatures. We do, however, have the more important property: unforgeability.
The question is how binding signatures are useful in a practical context. I mean, if RFC8032, NIST and ZIP215 never mention it, there should be some reason.
Default noble behavior is ZIP215, which is important for blockchain consensus. With zip215: false we fall back to RFC8032 / NIST behavior, which rejects A >= p. Both ZIP215 and RFC8032 give SUF-CMA.
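For reference, here is a minimal sketch (plain BigInt, not noble's actual code) of the canonicality rule in question: an Ed25519 public key or R value stores y in the low 255 bits of the 32-byte little-endian encoding, and RFC8032 / NIST require y < p while ZIP215 accepts any 255-bit value. The helper name `isCanonicalY` is made up for illustration:

```javascript
const P = 2n ** 255n - 19n; // Ed25519 field prime

// Hypothetical helper: decode the y-coordinate from 64 hex chars (little-endian,
// sign bit dropped) and apply the RFC8032-style canonicality check.
function isCanonicalY(hex32) {
  const bytes = hex32.match(/../g).map((h) => parseInt(h, 16));
  let y = 0n;
  for (let i = 31; i >= 0; i--) {
    y = (y << 8n) | BigInt(bytes[i] & (i === 31 ? 0x7f : 0xff)); // mask off sign bit
  }
  return y < P; // RFC8032 / NIST: reject y >= p; ZIP215 skips this check
}
```

For example, the encoding with y = p - 1 is canonical, while the one with y = p is not.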
Hi Paul, yes, with cases 6 and 7 we get SUF-CMA. In their presentation to NIST https://csrc.nist.gov/csrc/media/Presentations/2023/crclub-2023-03-08/images-media/20230308-crypto-club-slides--taming-the-many-EdDSAs.pdf they give more motivation for SBS (Strongly Binding Signatures). They cite a more recent blockchain attack on Monero. I'm interested in verifiable credentials, so I like SBS.
However, before going there, it seems that cases 8-11 should be caught by the "negative zero x test" of https://www.rfc-editor.org/rfc/rfc8032.html#section-5.1.3. I'm not sure how this works in your code and why it's not failing these cases. The A or R (x, y) values are $(-0, 2^{255} - 20)$ in these cases.
For SBS they say: "Reject small order A (line #2, Alg. 2): This check makes the scheme strongly binding (SBS-secure, see Definition 1 in Section 2)". There are only a small number of small-order A points in both canonical and non-canonical serializations, so I was crudely checking against these strings of "bad public keys":
```javascript
const smallOrderPoints = [
  "0100000000000000000000000000000000000000000000000000000000000000",
  "ECFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F",
  "0000000000000000000000000000000000000000000000000000000000000080",
  "0000000000000000000000000000000000000000000000000000000000000000",
  "C7176A703D4DD84FBA3C0B760D10670F2A2053FA2C39CCC64EC7FD7792AC037A",
  "C7176A703D4DD84FBA3C0B760D10670F2A2053FA2C39CCC64EC7FD7792AC03FA",
  "26E8958FC2B227B045C3F489F2EF98F0D5DFAC05D3C63339B13802886D53FC05",
  "26E8958FC2B227B045C3F489F2EF98F0D5DFAC05D3C63339B13802886D53FC85",
  "0100000000000000000000000000000000000000000000000000000000000080",
  "ECFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EEFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F",
  "EEFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EDFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EDFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F"
];
```
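As a self-contained sketch of that crude check, something like the following (the helper name `isBadPublicKey` is made up; the strings are the same 14 small-order encodings, compared case-insensitively):

```javascript
// Hypothetical helper, not part of noble: flag a public key whose 32-byte hex
// encoding is one of the 14 small-order points (canonical and non-canonical).
const SMALL_ORDER = new Set([
  "0100000000000000000000000000000000000000000000000000000000000000",
  "ECFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F",
  "0000000000000000000000000000000000000000000000000000000000000080",
  "0000000000000000000000000000000000000000000000000000000000000000",
  "C7176A703D4DD84FBA3C0B760D10670F2A2053FA2C39CCC64EC7FD7792AC037A",
  "C7176A703D4DD84FBA3C0B760D10670F2A2053FA2C39CCC64EC7FD7792AC03FA",
  "26E8958FC2B227B045C3F489F2EF98F0D5DFAC05D3C63339B13802886D53FC05",
  "26E8958FC2B227B045C3F489F2EF98F0D5DFAC05D3C63339B13802886D53FC85",
  "0100000000000000000000000000000000000000000000000000000000000080",
  "ECFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EEFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F",
  "EEFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EDFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF",
  "EDFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF7F"
]);

function isBadPublicKey(pubKeyHex) {
  return SMALL_ORDER.has(pubKeyHex.toUpperCase());
}
```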
> cases 8-11 should be caught by the "negative zero x test" of https://www.rfc-editor.org/rfc/rfc8032.html#section-5.1.3
Which test? I don't see any negative zero x test there.
There is no "negative zero" in bigints. There is just 0n.
Somehow the x = 0 and x_0 = 1 check was missed. Adding it for zip215: false keeps the tests passing. Guess we'll need to add it.
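In isolation, that check looks like this (a sketch with a made-up helper name, not noble's code): per RFC8032 section 5.1.3, step 4, once x is recovered from y, decoding fails when x = 0 but the encoded sign bit x_0 is 1; otherwise the sign bit selects x or p - x.

```javascript
const P = 2n ** 255n - 19n; // Ed25519 field prime

// Hypothetical helper: apply RFC8032's sign rule to a recovered x-coordinate.
// x is the square root computed from y; signBit is the top bit of byte 31.
function applySign(x, signBit) {
  if (x === 0n && signBit === 1) {
    // "negative zero": the check that a ZIP215 decoder skips
    throw new Error("invalid encoding: x = 0 with x_0 = 1");
  }
  return (x & 1n) === BigInt(signBit) ? x : P - x; // negate if parity mismatches
}
```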
Vectors 2-5 are still failing even after your suggested test against smallOrderPoints, so I'm guessing the check is incomplete.
| vector | A | result |
| --- | --- | --- |
| 0 | c7176a703d4dd84fba3c0b760d10670f2a2053fa2c39ccc64ec7fd7792ac03fa | false |
| 1 | c7176a703d4dd84fba3c0b760d10670f2a2053fa2c39ccc64ec7fd7792ac03fa | false |
| 2 | f7badec5b8abeaf699583992219b7b223f1df3fbbea919844e3f7c554a43dd43 | true |
| 3 | cdb267ce40c5cd45306fa5d2f29731459387dbf9eb933b7bd5aed9a765b88d4d | true |
| 4 | cdb267ce40c5cd45306fa5d2f29731459387dbf9eb933b7bd5aed9a765b88d4d | true |
| 5 | cdb267ce40c5cd45306fa5d2f29731459387dbf9eb933b7bd5aed9a765b88d4d | true |
| 6 | 442aad9f089ad9e14647b1ef9099a1ff4798d78589e66f28eca69c11f582a623 | false |
| 7 | 442aad9f089ad9e14647b1ef9099a1ff4798d78589e66f28eca69c11f582a623 | false |
| 8 | f7badec5b8abeaf699583992219b7b223f1df3fbbea919844e3f7c554a43dd43 | false |
| 9 | f7badec5b8abeaf699583992219b7b223f1df3fbbea919844e3f7c554a43dd43 | false |
| 10 | ecffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff | false |
| 11 | ecffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff | false |
OK, finally understood the paper. All the checks can simply be reduced to two lines of code. There is no need to check for a list of points etc.
Here's the function that you'll need (after my change to x_0 fromHex):
```javascript
function verifyStrict(sig, msg, publicKey) {
  let A;
  try {
    // zip215 = false: reject non-canonical encodings per RFC8032
    A = Point.fromHex(publicKey, false);
  } catch (error) {
    return false;
  }
  if (A.isSmallOrder()) return false;   // points 0-1
  if (!A.isTorsionFree()) return false; // points 2-5
  return ed25519.verify(sig, msg, publicKey, { zip215: false });
}
```
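For readers wondering what `isSmallOrder()` and `isTorsionFree()` actually compute, here is a from-scratch sketch in plain BigInt: multiply the decoded point by the cofactor 8 and by the group order L, and test for the identity. This is illustrative only, not noble's implementation; the constants and formulas are standard Ed25519, and all helper names (`decode`, `add`, `mul`, etc.) are made up here.

```javascript
const P = 2n ** 255n - 19n;                                      // field prime
const D = 37095705934669439343138083508754565189542113879843219016388785533085940283555n; // curve d
const L = 2n ** 252n + 27742317777372353535851937790883648493n;  // group order
const mod = (a) => ((a % P) + P) % P;
function pow(b, e) { let r = 1n; b = mod(b); for (; e > 0n; e >>= 1n) { if (e & 1n) r = mod(r * b); b = mod(b * b); } return r; }
const SQRT_M1 = pow(2n, (P - 1n) / 4n);                          // sqrt(-1) mod p

// Decompress a 32-byte little-endian hex encoding into extended coordinates.
function decode(hex) {
  const b = hex.match(/../g).map((h) => parseInt(h, 16));
  let y = 0n;
  for (let i = 31; i >= 0; i--) y = (y << 8n) | BigInt(b[i] & (i === 31 ? 0x7f : 0xff));
  const sign = b[31] >> 7;
  const y2 = mod(y * y);
  const u = mod(y2 - 1n), v = mod(D * y2 + 1n);
  const w = mod(u * pow(v, P - 2n));                             // x^2 = u / v
  let x = pow(w, (P + 3n) / 8n);                                 // sqrt candidate (p = 5 mod 8)
  if (mod(x * x) !== w) x = mod(x * SQRT_M1);
  if (mod(x * x) !== w) throw new Error("not a valid point");
  if ((x & 1n) !== BigInt(sign)) x = mod(-x);
  return [x, y, 1n, mod(x * y)];                                 // (X, Y, Z, T), T = XY/Z
}

// Unified (complete) twisted-Edwards addition in extended coordinates.
function add([X1, Y1, Z1, T1], [X2, Y2, Z2, T2]) {
  const A = mod((Y1 - X1) * (Y2 - X2)), B = mod((Y1 + X1) * (Y2 + X2));
  const C = mod(2n * D * T1 * T2), Dd = mod(2n * Z1 * Z2);
  const E = B - A, F = Dd - C, G = Dd + C, H = B + A;
  return [mod(E * F), mod(G * H), mod(F * G), mod(E * H)];
}

// Double-and-add scalar multiplication (the unified formula handles doubling).
function mul(pt, n) {
  let r = [0n, 1n, 1n, 0n];                                      // identity (0, 1)
  for (; n > 0n; n >>= 1n) { if (n & 1n) r = add(r, pt); pt = add(pt, pt); }
  return r;
}

const isIdentity = ([X, Y, Z]) => X === 0n && Y === Z;
const isSmallOrder = (pt) => isIdentity(mul(pt, 8n));            // 8P = identity?
const isTorsionFree = (pt) => isIdentity(mul(pt, L));            // LP = identity?
```

The small-order points die under multiplication by 8, while torsion-free points (like honestly generated keys) have order L, so multiplying by L yields the identity.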
Awesome! Yes, cases 2-5 won't be caught, but they don't affect SBS or SUF properties. I figured you would have a better way than checking the list of points. I'm thinking of pushing for this stronger verification in the verifiable credential specification that uses Ed25519 signatures. Any opinion?
I'll try out the code tomorrow. Cheers Greg
> I'm thinking of pushing for this stronger verification in the verifiable credential specification
Makes sense to me. I'm not sure:
Benchmark says isSmallOrder() check is safe and doesn't degrade perf. So, good-to-go in the default mode. The only remaining concern is backwards-compatibility.
To reaffirm, if 0-1 and 6-11 all return false now, SUF and SBS all seem to be in place? We don't need 2-5 and expensive isTorsionFree, do we?
Done, just use zip215: false now.
Confirmed from my reading of the paper: we don't need the torsion-free check. I'll try it tomorrow morning PDT (Pacific Daylight Time). Thanks a ton!
Independently confirmed the verification tests against main. Thanks a bunch. One last question: when will these changes get rolled up into an npm release?
Cheers Greg
The goal is to have non-critical releases once every 4-6 weeks or so. The last one was 3 weeks ago, so we're close.
noble-curves is heavily depended upon, so it takes a non-trivial amount of work to upgrade all dependents. Can't do that too often.
Sounds great! Closing the issue. There is some new "BBS with binding" work that will use your BLS signatures and BLS12-381 curve, along with my BBS implementation (based on your BLS12-381 curve), that I'll be starting in a couple of weeks.
Hi Paul, I'm very much enjoying @noble/curves for lots of my elliptic curve needs. Most recently I've been reviewing/implementing EdDSA signatures based on Ed25519.
The 2020 paper Taming the many EdDSAs explains that, with some extra validity checks (beyond those in RFC8032), EdDSA signatures based on Ed25519 can be proven strongly unforgeable (SUF-CMA) and strongly binding (SBS).
The authors test a number of Ed25519 libraries as of 2020 and find them missing many of the required checks. They give their short set of test vectors at https://github.com/novifinancial/ed25519-speccheck.
When I run their test vectors against @noble/curves Ed25519 signature verification, I get the results shown below. Note that all signatures should be invalid (false); the first column is the result of the recommended checks, the second is what noble/curves currently catches as invalid, and the others are looking for specific known-bad key and R values.
In an updated presentation to NIST, the authors mentioned that a recent version of Bouncy Castle (a Java crypto library) has been updated (2023).
Can we get these additional verification/validation checks for Ed25519 EdDSA signatures incorporated into noble/curves? Would you like assistance with this?
Best Regards Greg