sipa / bips

Bitcoin Improvement Proposals

Squareness vs oddness tie-breaker for public keys #191

Closed. real-or-random closed this issue 4 years ago.

real-or-random commented 4 years ago

Splitting discussions out. This one about the performance impact on signing, without precomputed public key data.

The current PR does one unnecessary multiplication, as it's needlessly computing the affine Y coordinate. After fixing that, switching from the square to the even tiebreaker trades 1 jacobi computation for 1 field multiplication plus a normalization, a difference dominated by the jacobi symbol.

I don't know if we care about this ~5us speedup at signing time (because when you do care, perhaps you want signing with precomputed public key data, which avoids this cost in both cases), but there seems to be basically no gain apart from consistency with R (which is arguably very different: handling of R is generally limited in scope to the signing code, while public keys are obviously exposed to applications, and things like the taproot tweak break their perfect black-boxiness).

Originally posted by @sipa in https://github.com/bitcoin-core/secp256k1/pull/558
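The two tiebreaker checks being compared can be sketched in Python. This is a toy illustration on normalized affine field elements, not the library's actual projective-coordinate code:

```python
p = 2**256 - 2**32 - 977  # secp256k1 field characteristic

def is_square(y):
    # Squareness tiebreaker: Euler's criterion, y^((p-1)/2) mod p == 1.
    # Comparable in cost to the jacobi symbol computation discussed above.
    return pow(y, (p - 1) // 2, p) == 1

def is_even(y):
    # Evenness tiebreaker: just the low bit of the normalized affine y.
    return y % 2 == 0
```

In the library, the evenness check additionally requires the affine y (the field multiplication and normalization mentioned above), which is what the trade-off weighs against the jacobi computation.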

real-or-random commented 4 years ago

> I don't know if we care about this ~5us speedup at signing time (because when you do care, perhaps you want signing with precomputed public key data, which avoids this cost in both cases), but there seems to be basically no gain apart from consistency with R (which is arguably very different: handling of R is generally limited in scope to the signing code, while public keys are obviously exposed to applications, and things like the taproot tweak break their perfect black-boxiness).

I think the question is not so much whether we care. We care about the details, and here it seems to me that switching to oddness is faster and basically free, because consistency with R buys us nothing.

sipa commented 4 years ago

@roconnor-blockstream pointed out on IRC that the even tiebreaker also adds a normalization step to the verifier.

gmaxwell commented 4 years ago

Careful with too much public key consistency. That's how you get people blowing their security by using the same key/nonce with ECDSA and Schnorr, or burning their coins by writing uncompressed pubkeys into the output. :P

I don't think this one matters too much one way or another. It's nice if the flagged public key can just be an ordinary 'compressed' public key.

Looks like 'oddness' (yuck) for the pubkey would just plain be faster, because jacobi is slower than a multiply; unlike for R, we get the square test and inverse for free as a byproduct of projecting x to affine. Even if you assume that the pubkey is going to be precomputed in signing (as it should be), that precomputation would still be faster.

@sipa and half a negate.

sipa commented 4 years ago

So, I think we should change BIP340 to use implicit-even as tiebreaker. To avoid inconsistently breaking potential code people may have written already, we should probably also change the challenge hash tag. How about "Schnorr/secp256k1" or so?
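Changing the tag is enough to break compatibility deliberately, because a BIP340-style tagged hash prefixes the message with two copies of SHA256 of the tag, so any two distinct tags yield unrelated challenge hashes. A minimal sketch (the "Schnorr/secp256k1" string is just the one floated in this comment, not necessarily the final choice):

```python
import hashlib

def tagged_hash(tag, msg):
    # BIP340-style tagged hash: SHA256(SHA256(tag) || SHA256(tag) || msg)
    tag_hash = hashlib.sha256(tag.encode()).digest()
    return hashlib.sha256(tag_hash + tag_hash + msg).digest()
```

Any code written against the old tag would then compute a different challenge for the same message, which is exactly the intended clean break.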

sipa commented 4 years ago

This also means changing BIP341 to change the flag in the control block to be the odd/even bit of Q, rather than its squaredness.

Implementation-wise, I wonder if this means we can't just drop a bunch of the x-only code, and just use the existing pubkey tweaking functions in the consensus verifier.
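Under that change, the low bit of the control block's first byte would carry the odd/even bit of Q's y coordinate instead of its squaredness. A hypothetical sketch of the packing (the function name and layout here are invented for illustration):

```python
def control_byte(leaf_version, q_y):
    # Hypothetical packing: the low bit of the first control-block byte
    # carries the parity of Q's affine y coordinate.
    assert leaf_version % 2 == 0  # low bit is reserved for the parity flag
    return leaf_version | (q_y % 2)
```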

jonasnick commented 4 years ago

I share the slight preference for implicit-even as tiebreaker because it simplifies the scheme (by avoiding the jacobi symbol), despite the additional verification branch. Also, implementation-wise this makes the full/auxiliary pubkey type identical to the existing one and there's no need for a tiebreaker flag.

As for compatibility between auxiliary/flagged and compressed pubkeys, it seems like compressed pubkeys are already compatible with x-only pubkeys: an adversary can easily convert between the two. At least I'm not able to come up with a non-adversarial scenario or practical attack where this matters.

ajtowns commented 4 years ago

I think the only reason squareness was chosen over evenness was because we'd already specced has_square_y(P) and P = lift_x(x) conversion functions for dealing with R. Switching to evenness sounds like a good tradeoff if it makes implementation/use a bit easier.
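With the even tiebreaker, lift_x picks the candidate point whose y is even. A sketch of the decompression under the usual secp256k1 parameters (p ≡ 3 mod 4, so a square root is a single exponentiation); this mirrors the BIP340-style construction but is not the spec's reference code verbatim:

```python
p = 2**256 - 2**32 - 977  # secp256k1 field characteristic

def lift_x(x):
    # Return the curve point with this x and even y, or None if x does not
    # correspond to a point. Curve: y^2 = x^3 + 7 over GF(p).
    if x >= p:
        return None
    y_sq = (pow(x, 3, p) + 7) % p
    y = pow(y_sq, (p + 1) // 4, p)  # candidate sqrt, valid since p ≡ 3 (mod 4)
    if y * y % p != y_sq:
        return None  # x^3 + 7 is not a square: no such point
    return (x, y if y % 2 == 0 else p - y)
```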

sipa commented 4 years ago

Fixed by #192.