On 18.11.2015 at 13:48, eric1894 wrote:
The following page gives a list of safe curves to use with ECC:
As far as I can see, Crypto++ doesn't currently implement any of the safe curves listed on that page (e.g. Curve25519)
We have Curve25519 (and similar) curve support on our TO-DO list and hope to ship the support with Crypto++ 6.0, either with a fast native solution or by adapting our current Weierstrass-oriented engine (no speed advantage).
, and a lot of the curves that are implemented have known issues (e.g. secp256k1).
These are primarily ease-of-secure-implementation issues and I think Crypto++ has mitigated them.
I'm no expert, but would I be right in thinking that the work that needs doing is: (1) adding a new OID to oids.h, and (2) adding the curve parameters to eccrypto.cpp?
The safe curves require a bit more work. While we would indeed have to add OIDs and store the parameters somewhere, most of the curves (in Montgomery and Edwards representation) require a different format and different operations (you may see now why it hasn't been done yet).
As far as I can see, Crypto++ doesn't currently implement any of the safe curves listed on that page
As JPM said, it's planned.
but would I be right in thinking that the work that needs doing is: (1) adding a new OID to oids.h, and (2) adding the curve parameters to eccrypto.cpp?
(3) Test cases, and lots of them. I feel comfortable when we have something for testing and validation. If we lack them, then I start getting nervous because of bugs and interop problems. We're suffering one of them now with ECIES because we lacked a specification with test cases, and we lacked other software to test interop.
We also need to verify the curves can be represented. Things may be OK out of the box, or it may need a specialization.
This came out today from the IETF: CURDLE (CURves, Deprecating and a Little more Encryption). CURDLE includes curve448 and curve25519. I don't know if the administrivia has been completed, like assigning IANA numbers, OIDs and such.
It looks like the IETF has recently published some information on curve448 and curve25519: Using Curve25519 and Curve448 Public Keys in PKIX. It includes OID assignments.
We also need to verify the curves can be represented. Things may be OK out of the box, or it may need a specialization.
OK, so this looks like the pain point. The library can currently handle curves of the form (from Certicom's SEC2, Section 2.1): y^2 = x^3 + a * x + b (mod p).
However, the Montgomery representation of the Edwards curves is: v^2 = u^3 + a * u^2 + u (mod p). That means the various GroupParameters_EC<T> (where T is ECP or EC2N) won't work. Instead, we will likely need GroupParameters_ED<T> (where T is ECP). Notice we changed the *_EC to *_ED.
I think we will need to study this a little more before we start forking implementations or supplementing behaviors. The math looks OK; it's the "clean cut-in", which makes sense from a software engineering perspective, that needs some attention.
JPM and Uri have a better understanding of the underlying mathematics, so all three of us will probably need to contribute cycles.
They are not yet done (as you can see, this is a draft and not an RFC). Furthermore, there's even a dedicated draft for Curve25519 and Curve448: https://tools.ietf.org/html/draft-irtf-cfrg-curves-11
On 20.11.2015 at 12:32, Jeffrey Walton wrote:
It looks like the IETF has recently published some information on curve448 and curve25519: Using Curve25519 and Curve448 Public Keys in PKIX https://tools.ietf.org/html/draft-josefsson-pkix-newcurves.
As for the below issue, we'd need to decide how to approach the Edwards / Montgomery support.
a) Create the _ED classes and let them only do I/O conversions so we can continue using our Weierstrass code to do the job (fastest solution; I'm sure we could do this for 5.7; would be temporary).
b) Create the _ED classes and let them do the full job, i.e. let them do the actual math. We'd need to implement the math and optimize it (ASM); this would have to wait for 6.0 IMO.
a) would be possible as per http://crypto.stackexchange.com/q/27842/23623
We also need to verify the curves can be represented. Things may be OK out of the box, or it may need a specialization.
OK, so this looks like the pain point.
The library can currently handle curves of the form (from Certicom's SEC2, Section 2.1, http://www.secg.org/SEC2-Ver-1.0.pdf): y^2 = x^3 + a * x + b (mod p).
However, the Montgomery representation of the Edwards curves is: v^2 = u^3 + a * u^2 + u (mod p). That means the various GroupParameters_EC<T> (where T is ECP or EC2N) won't work. Instead, we will likely need GroupParameters_ED<T> (where T is ECP). Notice we changed the EC to ED.
This would be approach a) from above because our ECP code is Weierstrass only (AFAICT).
I think we will need to study this a little more before we start forking behaviors. The math looks OK; it's the "clean cut-in", which makes sense from a software engineering perspective, that needs some attention.
a) Create the _ED classes and let them only do I/O conversions so we can continue using our Weierstrass code to do the job (fastest solution; I'm sure we could do this for 5.7; would be temporary).
b) Create the _ED classes and let them do the full job, i.e. let them do the actual math. We'd need to implement the math and optimize it (ASM); this would have to wait for 6.0 IMO.
a) would be possible as per http://crypto.stackexchange.com/q/27842/23623
From the answer on the Crypto.SE:
... None of these conversions are terribly expensive compared to the cost of a scalar multiplication.
Is the author saying, "efficiency is equivalent because multiplication is expensive in all the representations"? Or is he saying to avoid a particular representation because multiplication is expensive?
Also, we have to keep an eye on the near-constant time security property in the Edwards representations. Does that hold in the other representations? Or maybe I should back up and ask, should we worry about it?
By the way, one of your wishes has moved forward: 5.6.3 added a Name::Blinding parameter to control timing attack remediations. We can disable blinding as desired in implementations like AES and RW signatures. The downside is no code uses it (yet).
On 20.11.2015 at 17:02, Jeffrey Walton wrote:
a) Create the _ED classes and let them only do I/O conversions so we can continue using our Weierstrass code to do the job (fastest solution; I'm sure we could do this for 5.7; would be temporary).
b) Create the _ED classes and let them do the full job, i.e. let them do the actual math. We'd need to implement the math and optimize it (ASM); this would have to wait for 6.0 IMO.
a) would be possible as per http://crypto.stackexchange.com/q/27842/23623
From the answer on the Crypto.SE:
... None of these conversions are terribly expensive compared to the cost of a scalar multiplication.
Is the author saying, "efficiency is equivalent because multiplication is expensive in all the representations"? Or is he saying to avoid a particular representation because multiplication is expensive?
He's saying: "You don't have to worry too much about losing much performance if you do the conversion; it's rather cheap if you compare it to the operation following it." He's not saying Weierstrass is faster (because it isn't). That using the Montgomery ladder is better is obviously true, but this was out of the scope of the question.
I guess we get something like a few percent overhead on the ECC code using this conversion (Edwards -> Montgomery -> Weierstrass), and less if we can actually make it work so the conversion is only used on external input / output.
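For concreteness, the parameter conversion boils down to something like this for Curve25519 (a small sketch using the existing Integer and ModularArithmetic classes; not library code, just the standard Montgomery-to-Weierstrass map):

```cpp
#include <cryptopp/integer.h>
#include <cryptopp/modarith.h>
#include <iostream>

using namespace CryptoPP;

int main()
{
    // Curve25519 in Montgomery form: v^2 = u^3 + 486662*u^2 + u over GF(2^255 - 19),
    // i.e. A = 486662 and B = 1.
    Integer p = Integer::Power2(255) - Integer(19);
    Integer A(486662), B(1);
    ModularArithmetic F(p);

    // Equivalent short Weierstrass curve y^2 = x^3 + a*x + b with
    //   a = (3 - A^2) / (3*B^2)   and   b = (2*A^3 - 9*A) / (27*B^3)   (mod p)
    Integer A2   = F.Square(A);
    Integer numA = F.Subtract(Integer(3), A2);
    Integer denA = F.Multiply(Integer(3), F.Square(B));
    Integer a    = F.Divide(numA, denA);

    Integer A3    = F.Multiply(A2, A);
    Integer twoA3 = F.Multiply(Integer(2), A3);
    Integer nineA = F.Multiply(Integer(9), A);
    Integer numB  = F.Subtract(twoA3, nineA);
    Integer B3    = F.Multiply(F.Square(B), B);
    Integer denB  = F.Multiply(Integer(27), B3);
    Integer b     = F.Divide(numB, denB);

    // A Montgomery point (u, v) then maps to the Weierstrass point
    //   x = u/B + A/(3*B),   y = v/B
    // and the inverse map takes results back after the scalar multiplication.
    std::cout << "a = " << a << std::endl;
    std::cout << "b = " << b << std::endl;
    return 0;
}
```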
Also, we have to keep an eye on the near-constant time security property in the Edwards representations. Does that hold in the other representations? Or maybe I should back up and ask, should we worry about it?
Our Weierstrass code is constant-time / timing-attack resistant (at least I hope so). If we apply the conversions on input and output, we should get a timing-attack resistant Montgomery / Edwards module. Of course this isn't a replacement for a real implementation of Montgomery / Edwards arithmetic, but we can delay those for more pressing issues (like the lack of certified RNGs).
I'll look into the related ECC code as soon as I've some time (and maybe code the necessary changes straight away).
By the way, one of your wishes has moved forward: 5.6.3 added a Name::Blinding parameter to control timing attack remediations. We can disable blinding as desired in implementations like AES and RW signatures. The downside is no code uses it (yet).
Yippie :) But I think we should make it rather hard to disable our countermeasures (they are in place for a very good reason), or at least place a big warning sign.
I guess we get something like a few per cent overhead on the ECC code using this conversion (Edwards -> Montgomery -> Weierstrass) (and less if we can actually make it work to only use on external input / output)
OK, so maybe what we want for a quick cut-in is a stand-alone function similar to what DSA provides for converting between P1363, Java/.Net and OpenPGP signature representations. The DSA function is called DSAConvertSignatureFormat. Maybe it could be called ECConvertCurveRepresentation or similar....
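For reference, the existing DSA converter is used roughly like this (a usage sketch against the current dsa.h interface; a hypothetical ECConvertCurveRepresentation could follow the same shape):

```cpp
#include <cryptopp/dsa.h>
#include <cryptopp/secblock.h>

using namespace CryptoPP;

// Convert a P1363-style (r || s) signature into DER. DSAConvertSignatureFormat
// returns the number of bytes written into the destination buffer.
SecByteBlock P1363ToDER(const SecByteBlock &p1363)
{
    SecByteBlock der(p1363.size() + 16);   // DER framing adds a few bytes
    size_t n = DSAConvertSignatureFormat(der, der.size(), DSA_DER,
                                         p1363, p1363.size(), DSA_P1363);
    der.resize(n);
    return der;
}
```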
I'm not sure about the hand-tuned assembly for the curve. It seems like a lot of work for a few classes. Plus, Integer is already tuned for assembly language.
I guess we get something like a few per cent overhead on the ECC code using this conversion (Edwards -> Montgomery -> Weierstrass) (and less if we can actually make it work to only use on external input / output)
OK, so maybe what we want for a quick cut-in is a stand-alone RepresentationConverter function similar to what DSA provides for converting between P1363, Java/.Net and OpenPGP signature representations (http://www.cryptopp.com/docs/ref/dsa_8h.html).
Sure, this will work.
The only issue with this is that developers may not be aware of such a function and will be wondering why the output point (with their test vector) is wrong, because ECDH
Also, to get an idea for the breadth of this... Check out DL_GroupParameters and DL_FixedBasePrecomputation, and everything that has to be implemented to support them.
I believe Wei mentioned he had a preliminary implementation of Curve25519 at some point, and that he was seeing good performance. However, I'm not sure he has published that code. I'm not sure whether he feels that code could be used by the project or not, but it might not hurt to ask?
This just made the radar from the TLS Working Group: Early code point assignments for 25519/448 curves.
That's good news.
I've had some time to look into our ECC code. I think we need to re-implement ECP for Montgomery and Edwards, as this is the main way the library handles different curves (see EC2N for reference). The DL_..._EC classes mainly handle generic ECC stuff and make no distinction between EC2N and ECP (only one class is specialized), so we'd use them to implement (classic) ECDSA (and ECDH) for Ed curves. As for the how, we could give each of the ECPM and ECPE(d?) classes an instance of the lower-level class (ECPE -> ECPM -> ECPW) for processing.
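To make the delegation idea a bit more concrete, a rough sketch (class and member names are hypothetical; a real ECPM would have to implement the full ECP-style interface): the Montgomery curve converts points at its boundary and lets an internal Weierstrass ECP do the group math, i.e. approach a) from above.

```cpp
#include <cryptopp/ecp.h>
#include <cryptopp/integer.h>
#include <cryptopp/modarith.h>

using namespace CryptoPP;

// Hypothetical Montgomery curve B*v^2 = u^3 + A*u^2 + u that delegates the
// group law to an equivalent Weierstrass ECP instance. a and b are the
// converted Weierstrass parameters (see the conversion sketch earlier).
class ECPM_Sketch
{
public:
    typedef ECPPoint Point;

    ECPM_Sketch(const Integer &p, const Integer &A, const Integer &B,
                const Integer &a, const Integer &b)
        : m_field(p), m_B(B), m_weier(p, a, b)
    {
        Integer threeB = m_field.Multiply(Integer(3), B);
        m_shift = m_field.Divide(A, threeB);            // A/(3B)
        m_Binv  = m_field.MultiplicativeInverse(B);     // 1/B
    }

    // (u, v) -> (x, y) = (u/B + A/(3B), v/B)
    Point ToWeierstrass(const Point &P) const
    {
        Integer uB = m_field.Multiply(P.x, m_Binv);
        Integer vB = m_field.Multiply(P.y, m_Binv);
        Integer x  = m_field.Add(uB, m_shift);
        return Point(x, vB);
    }

    // (x, y) -> (u, v) = (B*(x - A/(3B)), B*y)
    Point FromWeierstrass(const Point &P) const
    {
        Integer xs = m_field.Subtract(P.x, m_shift);
        Integer u  = m_field.Multiply(m_B, xs);
        Integer v  = m_field.Multiply(m_B, P.y);
        return Point(u, v);
    }

    // Group operations convert in, use the existing Weierstrass engine, convert out.
    Point ScalarMultiply(const Point &P, const Integer &k) const
    {
        Point W = m_weier.ScalarMultiply(ToWeierstrass(P), k);
        return FromWeierstrass(W);
    }

private:
    ModularArithmetic m_field;
    Integer m_B, m_Binv, m_shift;
    ECP m_weier;
};
```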
On 23.11.2015 at 15:28, Jeffrey Walton wrote:
This just made the radar from the TLS Working Group: Early code point assignments for 25519/448 curves (https://www.ietf.org/mail-archive/web/tls/current/maillist.html).
Thanks JPM. So the issue or problem statement seems to lie somewhere around improving support for the form or shape of the curve in an extensible way.
The existing routines are generic, so they are probably not as efficient as they could be. If we simply cut in the Edwards stuff using what we have, then there will be opportunity for improvement (in addition to the existing opportunity for improvement). So the open question (for me) is, do we want to...
_First option_
                 EC
                 |
  +------+------+------+------+------+
  |      |      |      |      |      |
ECPE   ECPM   ECPW  EC2NE  EC2NM  EC2NW

The above uses the existing DL_GroupParameters_EC<EC> and effectively combines field and form into a single template parameter.
I mostly understand where this lies in the grand scheme of things, and how it intersects with software engineering.
This will likely add to the EC object vtables. The optimizer may be able to optimize it away.
_Second option_
DL_GroupParameters_ECM, DL_GroupParameters_ECW, DL_GroupParameters_ECE.
The above provides an implicit specialization on the form, and will likely need to inherit from DL_GroupParameters_EC.
I don't know about this one... This is kind of like a cross between function specializations and unrolling into plain functions (with some hand waving). I'm trying to understand it better, and here's some reading on it: Why Not Specialize Function Templates?
This will likely add to the DL_GroupParameters_EC object vtables. The optimizer may be able to optimize it away.
_Third option_
struct EC_Shape
{
    enum { Weierstrass=1, Montgomery, Edwards };
};

template <class EC, class FORM>
class DL_GroupParameters_EC
{
    ...
};

DL_GroupParameters_EC<ECP, Weierstrass>::Initialize(...)
{
    ...
}
I mostly understand where this lies in the grand scheme of things, and how it intersects with software engineering.
This may not add to the object vtables. The optimizer may be able to optimize it away.
And what do we do if we want to add a new form or shape? Maybe someone is testing something that's not-in-the-box, so they need some extensibility.
In this case, I'm thinking (3) is most extensible. It can handle long and short Weierstrass, a twisted Edwards, etc.
For the case against (3), the library mostly uses (2) for the specializations. The specializations are a mashup of concepts: first is the scheme or algorithm specialization. gfpcrypt has a bunch of these - one for DSA, one for ElGamal, etc. It basically provides policy. Second is efficiency for field operations. But they all appear to be generic, and none of them appear to specialize in an effort to improve speed or efficiency.
So maybe what we want is a combination of (2) and (3).
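Roughly, a combination might look like this (all of the names below are hypothetical; none of this is existing library code): a form tag as a template parameter, with the per-form math isolated in a small policy class instead of a whole parallel DL_GroupParameters_* hierarchy.

```cpp
#include <cryptopp/ecp.h>

using namespace CryptoPP;

// Form tags (from option 3, but as types rather than enum values).
struct Weierstrass {};
struct Montgomery  {};
struct Edwards     {};

// Per-form policy (the flavor of option 2): the group law lives here.
template <class EC, class Form>
struct EC_FormPolicy;

template <class EC>
struct EC_FormPolicy<EC, Weierstrass>
{
    typedef typename EC::Point Point;
    static Point Add(const EC &curve, const Point &p, const Point &q)
        { return curve.Add(p, q); }   // existing Weierstrass code path
};

// Montgomery / Edwards specializations would either convert to the
// Weierstrass form (approach a) or carry their own addition laws (approach b).

// The group-parameters layer just dispatches through the policy, so adding a
// new form means adding one policy specialization, not a new class hierarchy.
template <class EC, class Form = Weierstrass>
class EC_GroupParameters_Sketch
{
public:
    typedef EC_FormPolicy<EC, Form> Policy;
    // ... DL_GroupParameters-style interface built on Policy::Add, etc. ...
};
```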
Sorry for the delay in my answer.
On 25.11.2015 at 03:12, Jeffrey Walton wrote:
Thanks JPM. So the issue or problem statement seems to lie somewhere around improving support for the form or shape of the curve in an extensible way.
The existing routines are generic, so they are probably not as efficient as they could be. If we simply cut in the Edwards stuff using what we have, then there will be opportunity for improvement (in addition to the existing opportunity for improvement). So the open question (for me) is, do we want to...
_First option_
                 EC
                 |
  +------+------+------+------+------+
  |      |      |      |      |      |
ECPE   ECPM   ECPW  EC2NE  EC2NM  EC2NW

Are there even Edwards / Montgomery curves over EC2N?

The above uses the existing DL_GroupParameters_EC<EC> and effectively combines field and form into a single template parameter. I mostly understand where this lies in the grand scheme of things, and how it intersects with software engineering.
This will likely add to the EC object vtables. The optimizer may be able to optimize it away.
Do we even have an (abstract) EC class!? The documentation tells me "no". Or is that your proposal, to implement one?
Personally I prefer this approach, as it would also allow us to "easily" add the new curve Microsoft is promoting without having to re-implement DL_GroupParameters_ECX. And as far as genericity side-effects go, I think we're good with that if we use the templates. All the math is done in the ECP / EC2N classes at the moment, and the DL_GroupParameters classes provide a nicer interface to the raw ECP / EC2N methods.
_Second option_
DL_GroupParameters_ECM, DL_GroupParameters_ECW, DL_GroupParameters_ECE.
The above provides an implicit specialization on the form, and will likely need to inherit from DL_GroupParameters_EC.
I think this would force us to re-work the ECP and EC2N classes, as the W/E/M addition laws are all different and are currently implemented in ECP/EC2N.
I don't know about this one... This is kind of like a cross between function specializations and unrolling into plain functions (with some hand waving). I'm trying to understand it better, and here's some reading on it: Why Not Specialize Function Templates? http://www.gotw.ca/publications/mill17.htm
This will likely add to the DL_GroupParameters_EC object vtables. The optimizer may be able to optimize it away.
Or even add a template parameter:
struct EC_Shape
{
    enum { Weierstrass=1, Montgomery, Edwards };
};

template <class EC, class FORM>
class DL_GroupParameters_EC
{
    ...
};

DL_GroupParameters_EC<ECP, Weierstrass>::Initialize(...)
{
    ...
}

Same as above, although maybe actually worse. Maybe (I'm not sure on this topic) we need to implement the actual specialization in the header files, and we'd have to mess with the different addition laws and implement like a million different specializations (f.ex. for the MS curve).

I mostly understand where this lies in the grand scheme of things, and how it intersects with software engineering.
This may _not_ add to the object vtables. The optimizer may be able to optimize it away.
And what do we do if we want to add a new form or shape? Maybe someone is testing something that's not-in-the-box, so they need some extensibility.
Microsoft's new crazy curve beating Curve25519 on speed. https://eprint.iacr.org/2015/565.pdf
In this case, I'm thinking (3) is most extensible. It can handle long and short Weierstrass, a twisted Edwards, etc.
IMO (1) is the most extensible, as this completely separates math from administration and interfacing, and allows for quite fast addition of new curve shapes. Note that standard Edwards curves seem to be a special case of the twisted ones (a=1), and the addition law is also just a special case. AFAICT: https://en.wikipedia.org/wiki/Twisted_Edwards_curve https://en.wikipedia.org/wiki/Edwards_curve
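For reference, a small sketch of the unified twisted Edwards addition law over the library's ModularArithmetic (just an illustration, not proposed library code); setting a = 1 recovers the plain Edwards law, which is the special case mentioned above:

```cpp
#include <cryptopp/integer.h>
#include <cryptopp/modarith.h>

using namespace CryptoPP;

struct EdPoint { Integer x, y; };

// Unified addition on the twisted Edwards curve a*x^2 + y^2 = 1 + d*x^2*y^2 (mod p).
EdPoint EdwardsAdd(const ModularArithmetic &F, const Integer &a, const Integer &d,
                   const EdPoint &P, const EdPoint &Q)
{
    Integer x1x2 = F.Multiply(P.x, Q.x);
    Integer y1y2 = F.Multiply(P.y, Q.y);
    Integer x1y2 = F.Multiply(P.x, Q.y);
    Integer y1x2 = F.Multiply(P.y, Q.x);
    Integer t    = F.Multiply(d, F.Multiply(x1x2, y1y2));   // d*x1*x2*y1*y2
    Integer one  = F.MultiplicativeIdentity();

    Integer xnum = F.Add(x1y2, y1x2);        // x1*y2 + y1*x2
    Integer xden = F.Add(one, t);            // 1 + d*x1*x2*y1*y2
    Integer ax   = F.Multiply(a, x1x2);      // a*x1*x2 (a = 1 gives plain Edwards)
    Integer ynum = F.Subtract(y1y2, ax);     // y1*y2 - a*x1*x2
    Integer yden = F.Subtract(one, t);       // 1 - d*x1*x2*y1*y2

    EdPoint R;
    R.x = F.Divide(xnum, xden);
    R.y = F.Divide(ynum, yden);
    return R;
}
```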
@JPM - thanks for the reply.
My apologies for your displeasure or concern at Communication / statement of priorities for library development. Or maybe my inability to articulate well.
My apologies for asking you this a couple of times: are you working on this, and do you have a working implementation or a diff of something? If not, then I suggest you consider implementing some of your ideas.
Also, there's no need to sign your GitHub responses, and leave old signature blocks. You can remove them.
On 27.11.2015 at 03:08, Jeffrey Walton wrote:
@JPM https://github.com/JPM - thanks for the reply.
My apologies for your displeasure or concern at Communication / statement of priorities for library development http://groups.google.com/d/msg/cryptopp-users/lpQS8PPhJ0c/CDpgihyXBQAJ. Or maybe my inability to articulate well.
This is no criticism of you. I've a pretty good idea of what you're doing and why you're doing it. But others may not, and I think they should know (or at least I'd like to know if I weren't involved this much ;). And it may ease up coordination a bit, so nobody focuses too much on the fancy Skein stuff while wasting time that could be spent on the better ECC code.
My apologies for asking you this a couple of times: are you working on this, and do you have a working implementation or a diff of something? If not, then I suggest you consider implementing some of your ideas.
Well, I thought it was standard to design first and then implement and so I took the designer's standpoint in my previous replies. I didn't realize (until now) that it's standard to do things as "Implement and do the design on the go and adapt later if necessary". Sorry about my confusion there, I'll provide a working (I hope) implementation of my proposals (working ECPE and ECPM classes) within 24 hours (although it may not be as fast as possible using the "hacky" approach)
Also, there's no need to sign your GitHub responses, and leave old signature blocks. You can remove them.
Sorry for that one. I've enabled default-sign for this account and sometimes forget to disable it when sending to github (it's already auto-removed for the group, but that's a static recipient address and thus is easier)
On 27.11.2015 at 03:08, Jeffrey Walton wrote:
@JPM https://github.com/JPM - thanks for the reply.
My apologies for your displeasure or concern at Communication / statement of priorities for library development http://groups.google.com/d/msg/cryptopp-users/lpQS8PPhJ0c/CDpgihyXBQAJ. Or maybe my inability to articulate well.
My apologies for asking you this a couple of times: are you working on this, and do you have a working implementation or a diff of something? If not, then I suggest you consider implementing some of your ideas.
I've got a full implementation of ECPM now. ECPE won't be too different and will likely be done by tomorrow. However, I can't test things without a test curve that has an OID (e.g. M-221, M-511, M-383, Curve383187, Curve25519, Curve448), and none of them has one. The library won't run without at least one curve with an OID, due to the nature of DL_GroupParameters_EC.
AND I'm done with the Edwards code.
All that is now needed for the library to support ECPM and ECPE is OIDs for curves. As soon as I have some OIDs (maybe some fake / dummy ones?), I can provide you with a full 5.6.3 fork that adds ecp{e|m}.{cpp|h} and updates eccrypto.{h|cpp}, ecp.h (adds a typedef for ECPW) and our testing code.
Note that as far as details are concerned, everything is native Montgomery / Edwards processing except for the SimultaneousMultiply() functions, which use the linked Weierstrass trick.
Back in 2000/2001, I registered an OID for a project I was working on that never amounted to anything. You can find it here:
http://oid-info.com/get/1.3.6.1.4.1.9509
I hereby grant you the following OID to do with as you like, within the scope of the Crypto++ project:
1.3.6.1.4.1.9509.5
You can now make arbitrary OIDs under this root. For example, you can define:
1.3.6.1.4.1.9509.5.1 = Curve25519
Thanks Denis. This will be very helpful in the future. I could have used them a few years ago :).
I believe the IETF provided the following in June 2015 via Using Curve25519 and Curve448 in PKIX:
id-Curve25519 OBJECT IDENTIFIER ::= { 1.3.6.1.4.1.11591.7 }
id-Curve447 OBJECT IDENTIFIER ::= { 1.3.6.1.4.1.11591.8 }
GnuPG used these for the OpenPGP format:
{ "Curve25519", "1.3.6.1.4.1.3029.1.5.1", 255, "curve25519" },
{ "Ed25519", "1.3.6.1.4.1.11591.15.1", 255, "ed25519" },
It also looks like Google used the following in one of its projects:
+ 'CURVE_25519': [0x0A, 0x2B, 0x06, 0x01, 0x04,
+ 0x01, 0x97, 0x55, 0x01, 0x05, 0x01],
+ 'ED_25519': [0x09, 0x2B, 0x06, 0x01, 0x04, 0x01, 0xDA, 0x47, 0x0F, 0x01]
This leads to an interesting little quandary... Crypto++ does not have aliases for its curve registry. If we want to add 1 curve that goes by 2 different names, then we have to add full entries to the database.
With a unique entry for each name, the library will be able to read a key using either name. That is, the following will "just work": publicKey.Load(bt). However, when we perform a publicKey.Save(bt), which OID do we write?
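For reference, the round-trip in question looks roughly like this with today's Weierstrass curves (a sketch; SetEncodeAsOID controls whether the named-curve OID or the full domain parameters get written):

```cpp
#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/osrng.h>
#include <cryptopp/queue.h>
#include <cryptopp/sha.h>

using namespace CryptoPP;

void RoundTrip()
{
    AutoSeededRandomPool prng;

    ECDSA<ECP, SHA256>::PrivateKey priv;
    priv.Initialize(prng, ASN1::secp256r1());   // named curve, identified by OID

    ECDSA<ECP, SHA256>::PublicKey pub;
    priv.MakePublicKey(pub);
    pub.AccessGroupParameters().SetEncodeAsOID(true);   // write the OID, not full params

    ByteQueue bt;
    pub.Save(bt);       // serializes the OID we chose above

    ECDSA<ECP, SHA256>::PublicKey loaded;
    loaded.Load(bt);    // resolves the OID against the curve registry
}
```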
With Curve25519, I believe the public key should just be the 32 byte value, fixed length, no decoration.
So, no need to worry about encoding the OID, because it should probably not be encoded...
@all: I'm right now working on finishing the ECPM and ECPE code (sorry for the delay). I expect it to be finished within at most 12 hours, and will provide a patched 5.6.3 (which some Linux mastermind may usefully diff to patch it into our master branch).
The issues that aren't handled right now (and if you want them you'll have to wait a bit longer) are standardized encoding and automatic ECPE <-> ECPM conversion, as (I feel) is suggested by the IRTF drafts.
Now for some responses:
Denis:
Back in 2000/2001, I registered an OID for a project I was working on that never amounted to anything. You can find it here:
http://oid-info.com/get/1.3.6.1.4.1.9509
I hereby grant you the following OID to do with as you like, within the scope of the Crypto++ project:
1.3.6.1.4.1.9509.5
Thank you for this assignment :) I've already used it (see below).
You can now make arbitrary OIDs under this root. For example, you can define:
1.3.6.1.4.1.9509.5.1 = Curve25519
I just added a few OIDs in our code. I'm not sure whether we must / should register the OIDs somewhere.
Now for the allocations I made in the code (can be changed until release of 5.7 I guess):
1.3.6.1.4.1.9509.5 is now called "cryptopp()" in our OID code.
1.3.6.1.4.1.9509.5.1 is now called "cryptopp_montgomery_curves()" in our OID code.
// I did this to ensure our OID handling code doesn't fail
// I've set a "cryptopp" before the curve name to show that we assigned these values, not some standardization body, and that the OIDs may be removed
1.3.6.1.4.1.9509.5.1.1 is now cryptoppM221 // this and the following are listed on the SafeCurves project as Montgomery curves, http://safecurves.cr.yp.to/equation.html
1.3.6.1.4.1.9509.5.1.2 is now cryptoppM383
1.3.6.1.4.1.9509.5.1.3 is now cryptoppM511
1.3.6.1.4.1.9509.5.1.4 is now cryptoppCurve383187
// Curve25519 and Curve448 can be found at a different place, see below
1.3.6.1.4.1.9509.5.2 is now called "cryptopp_edwards_curves()" in our OID code.
1.3.6.1.4.1.9509.5.2.1 is now cryptoppE222
1.3.6.1.4.1.9509.5.2.2 is now cryptoppE382
1.3.6.1.4.1.9509.5.2.3 is now cryptoppEd448
1.3.6.1.4.1.9509.5.2.4 is now cryptoppE521
1.3.6.1.4.1.9509.5.2.5 is now cryptoppCurve41417
1.3.6.1.4.1.9509.5.2.6 is now cryptoppCurve1174
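In code, the new assignments could look roughly like this (a sketch; the actual oids.h may use the library's OID-definition macros, and the function names simply mirror the list above):

```cpp
#include <cryptopp/asn.h>

namespace CryptoPP {
namespace ASN1 {

inline OID cryptopp()                   { return OID(1)+3+6+1+4+1+9509+5; }
inline OID cryptopp_montgomery_curves() { return cryptopp()+1; }
inline OID cryptopp_edwards_curves()    { return cryptopp()+2; }

inline OID cryptoppM221()        { return cryptopp_montgomery_curves()+1; }
inline OID cryptoppM383()        { return cryptopp_montgomery_curves()+2; }
inline OID cryptoppM511()        { return cryptopp_montgomery_curves()+3; }
inline OID cryptoppCurve383187() { return cryptopp_montgomery_curves()+4; }

inline OID cryptoppE222()        { return cryptopp_edwards_curves()+1; }
inline OID cryptoppE382()        { return cryptopp_edwards_curves()+2; }
inline OID cryptoppEd448()       { return cryptopp_edwards_curves()+3; }
inline OID cryptoppE521()        { return cryptopp_edwards_curves()+4; }
inline OID cryptoppCurve41417()  { return cryptopp_edwards_curves()+5; }
inline OID cryptoppCurve1174()   { return cryptopp_edwards_curves()+6; }

} // namespace ASN1
} // namespace CryptoPP
```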
Jeffrey:
With a unique entry for each name, the library will be able to read a key using either name. That is, the following will "just work": publicKey.Load(bt). However, when we perform a publicKey.Save(bt), which OID do we write?
We write the OID we read from, or use the more standard one if we didn't read the curve parameters from somewhere. This means our cryptoppXXX OIDs will always lose, as does the custom GPG OID against the IETF OIDs.
Jeffrey:
I believe the IETF provided the following in June 2015 via Using Curve25519 and Curve448 in PKIX https://tools.ietf.org/html/draft-josefsson-pkix-newcurves:
id-Curve25519 OBJECT IDENTIFIER ::= { 1.3.6.1.4.1.11591.7 }
id-Curve447 OBJECT IDENTIFIER ::= { 1.3.6.1.4.1.11591.8 }
I didn't add them, because the draft changed the OIDs. I added the new ones.
Jeffrey:
GnuPG used these for the OpenPGP format http://lists.gnupg.org/pipermail/gnupg-devel/2015-July/030104.html:
|{ "Curve25519", "1.3.6.1.4.1.3029.1.5.1", 255, "curve25519" }, { "Ed25519", "1.3.6.1.4.1.11591.15.1", 255, "ed25519" }, | ||I'd prefer not to add them. a) the second one is exactly the same as is allocated by the IETF for Curve25519, so we don't get anything from that and b) the first one has a more standardized version nowadays (i.e. the second one).
Denis:
With Curve25519, I believe the public key should just be the 32 byte value, fixed length, no decoration.
So, no need to worry about encoding the OID, because it should probably not be encoded...
I'm not sure how this is handled with the encoding standards, but usually you do encode the OID to ensure the recipient knows which curve is in use.
Small update: All curves are now in our database (excluding Ed25519) and I've swapped the OIDs of Curve1174 and Curve41417 to reflect their respective field sizes (Curve41417's prime is much larger than Curve1174's).
The only thing left now is the addition of test cases which I'll do right now.
Awesome!
With regard to Curve25519 keys - in SSH, for example, the protocol provides its own information about key type, and doesn't use the OIDs generally. Perhaps it is done differently in other protocols.
@denisbider, TLS for example can use OIDs to designate named curves (so you don't have to transmit parameters).
I'm sorry to have to disappoint everyone reading this: while I do have the test cases, some issues arose. I didn't do the conversion (Weierstrass <-> Montgomery) properly, I made the new entries in our namedcurves database wrong (it turned out a specific byte count was mandatory), and there's a yet-to-be-debugged error which lets a test fail (the one which checks whether q*G = O). I'm working on it and hope to be able to provide you with a fully debugged version by the end of the week.
SSH uses OIDs for ECDSA / ECDH also, but the DJB curves are sufficiently different that they call for individual treatment. So although Curve25519, Ed25519, and Ed448 are being standardized, no OIDs are being used there.
No worries about the issues. :) Thank you for investing the effort!
I'm sorry to have to bring you guys bad news.
I managed to finish large portions of the ECPE/ECPM code (I still need to check the special point encoding though), but an issue arose: the (large) scalar multiplication is not working out as expected. It fails the test vectors from the IRTF and the library-internal tests (i.e. the test whether order*Basepoint = point at infinity).
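For anyone willing to take a look, the failing internal check boils down to something like this (a sketch over the existing Weierstrass ECP class; the parameter names are just placeholders for the curve under test):

```cpp
#include <cryptopp/ecp.h>
#include <cryptopp/integer.h>

using namespace CryptoPP;

// Sanity check that order * basepoint is the point at infinity.
bool OrderTimesBaseIsIdentity(const Integer &p, const Integer &a, const Integer &b,
                              const Integer &gx, const Integer &gy, const Integer &q)
{
    ECP curve(p, a, b);
    ECP::Point G(gx, gy);
    ECP::Point O = curve.ScalarMultiply(G, q);   // q * G
    return O.identity;                           // true only for the point at infinity
}
```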
I don't know why it's failing as it is. I've tried debugging this issue for ~1 week now (and found some other mistakes I made, but none that caused the issue), so I'm asking whether somebody could look over the code (only validat2.cpp, ECPM.h and ECPM.cpp are important to this bug) and try to find the issue.
I'll provide a full set of links to resources (to confirm my math is right), to the test vectors and to the curve parameters. I'll also leave my debugging code in there, but #if 0'ed. The final version will be stripped of that stuff. Right now I'm preparing the ZIP file and will upload it here shortly.
The zip archive is now attached. Note that the ECPE code should still be broken (but I'll fix all of that once the current issue is fixed). Note further that only ietfCurve25519() should give reliable tests, as the others should have encoding issues (numbers being dropped and such); I'll fix this as well before I call it "production ready / finished".
The links I promised:
http://safecurves.cr.yp.to/field.html for the field primes
http://safecurves.cr.yp.to/equation.html for the equations (and for confirming the transformations are right)
http://safecurves.cr.yp.to/base.html for the base points and their orders
http://safecurves.cr.yp.to/ladder.html for the cofactors
https://hyperelliptic.org/EFD/ for the addition and doubling laws used
The rest is just a copy & adapt from our existing code (ecp.cpp, ecp.h).
I removed some unnecessary parts from this copy (i.e. VS2010.zip which is applied, all .cmd files and the .sh file) because I suspected this was why GitHub didn't want me to upload it.
[File (sorry, Github refused to work)](https://muench-ger.net/cryptopp563 ECC Test.zip)
Interesting discussion from TLS WG: draft-ietf-tls-curve25519-01 and the X25519 significant bit.
@noloader - interesting discussion.
@DevJPM - I'm sorry I don't have any ideas about what might be causing the multiplication bug. :( I did look at the code, but I generally lack implementation experience in this, and would probably have to invest as much time as you did to catch up. :-|
I removed some unnecessary parts from this copy (i.e. VS2010.zip which is applied, all .cmd files and the .sh file) because I suspected this was why GitHub didn't want me to upload it.
@DevJPM - You might want to fork Wei's GitHub of Crypto++ rather than making copies and uploading them. It might be easier for you to take the little tweaks flowing downstream, and to provide tweaks back upstream.
I've been trying to use GitHub's Pull Request feature to ensure folks get credit for their contributions. If you provide a Pull Request (PR) based on a fork, then I can check it out, test it and ensure your contributions are listed at Crypto++ Contributors. I'd like to see folks like you, Uri, Denis, etc. listed there.
You can also rename the fork to CryptoJPM so your branding will survive.
@noloader Yes, I'll provide you with a proper pull request as soon as I have something nice for contribution.
@denisbider Thank you for your efforts; at least I can now exclude some things that I may have overlooked out of ignorance :)
I now (pre-debugging) have a hypothesis (yay): when doing the expensive exponentiation, the library automatically converts all related numbers to Montgomery representation. This works fine if done properly. I now suspect that the crucial parameters (a and b) get converted twice. This would change them to meaningless numbers, causing the issues we're seeing.
My strategy: go ahead and grab the needed numbers from SafeCurves, do the parameter conversion by hand, and verify the conversion is done right. Then look at an erroneous case and dump the parameters used by the compute engine without, with, and with double FromMontgomery conversion.
If this approach fails, I'll check all numbers for correctness.
I hope to have the issue fixed by tomorrow evening. If everything goes well, I'll also be able to fix the remaining database entries and get ECPM running. In the ideal case ECPE will also work (once I've applied what I learned from ECPM). So if I expect a long delay between finishing ECPM and ECPE, I'll get an ECPM-(somewhat-)only PR out.
That's it for the update.
I may have finally found the issues causing the (up to now) pain.
Issue 1) The converted curve parameter "a" was calculated in a bad way. Checking the math by hand (using a reliable CAS) showed that the "a" parameter of the equivalent Weierstrass curve was indeed calculated wrongly.
Issue 2) Use of wrong point for exponentiation. The code should convert the base point from the Montgomery curve to the Weierstrass curve, multiply it there and convert the result back. Instead it calculated the Weierstrass base point, dropped it, multiplied the Montgomery point with the exponents and converted the results back.
However the code is not fully working (yet). There still seems to be an issue with parameter encoding, as self-validations fail after the curve has been serialized and de-serialized again. However they don't appear when OID encoding is enabled.
I hope to fix this bug soon (by this evening).
Nice work! Thank you, JPM :-)
A quick update:
The serialization bug appeared because the code that de-serialized still used the old (bad) logic to calculate the a parameter. I've fixed this code and moved the construction of the Weierstrass curve from the Montgomery curve to its own function. This now works.
The current issue is that if you call .Precompute() on the key object you're using (via ECDSA or ECIES), this somehow changes the interaction with the curve class and makes the self-tests fail. I'll look for a way to a) (ideal solution) forward the .Precompute() to the underlying compute engine and thus make it work again (and faster?), or b) disable the changed behavior based on the prior Precompute() call.
@noloader, there are some other tricks to pull off (fixed exponent speed-ups) with RSA when you have the option to pre-compute.
For the EC classes .Precompute() leads to DL_FixedBasePrecomputationImpl
Note that I can't simply opt out of precomputation. Neither EcPrecomputation (or its base class) nor ECP (or its base class AbstractGroup) allows for anything like SupportsPrecomputation(); this is handled at a higher level (i.e. DL_GroupParameters_EC<>, AFAICT), and I really don't want to mess with DL_GroupParameters_EC.
Yet another update.
I didn't fix all the issues (yet).
I however did identify and fix an issue within the doubling function which gave bad results from there.
And I have further info on the precomputation issue. It's not an issue of precomputation. The real issue is that whether you use AbstractGroup's (small-scale) scalar multiplication or the one that is used when precomputation was done (two different code paths), you get bad results (although they're the same for both paths).
This gets more and more weird. Right now even the convert-to-Weierstrass-and-back code yields the wrong result for my KAT (e=3) (against 2G+G). So apparently it's some "larger" issue with my code... I'll have to take that update back: the Weierstrass intermediate does yield correct results.
I really hope this is the last issue I have to fix with the Montgomery code.
Well, I identified one source of issues:
I used the hard-coded "1" because it was there in the formula, but forgot to convert it to Montgomery Representation, basically yielding wrong results whenever this was enabled (i.e. when Pre-Computations is used)
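In other words (a small illustration using the library's MontgomeryRepresentation class, not the actual patch): every constant appearing in a formula has to be converted into the representation before it is used.

```cpp
#include <cryptopp/integer.h>
#include <cryptopp/modarith.h>

using namespace CryptoPP;

// Computes (x + 1) mod p while the arithmetic runs in Montgomery representation.
Integer MontgomeryDemo(const Integer &p, const Integer &x)
{
    MontgomeryRepresentation mr(p);
    Integer xm  = mr.ConvertIn(x);
    Integer one = mr.ConvertIn(Integer::One());  // the "1" from the formula, converted
    // Using Integer::One() directly here instead of 'one' silently yields wrong
    // results, which is exactly what happened when the precomputation path
    // switched the field arithmetic to this representation.
    Integer sum = mr.Add(xm, one);
    return mr.ConvertOut(sum);                   // back to the normal representation
}
```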
The (Montgomery) code now passes all self-tests (YIPPIE), except one where it crashes right now, which seems to be related to decompression; I'm investigating.
Chances are however that I'll be able to fix this one.
-> Done. I forgot to put a %p at the end of the line calculating the y-coordinate, oops.
Next step would be to head over to the database of curves and fix their format.
-> Done. All Montgomery curves now pass the standard level 2 self tests.
Only issue left: Fix, polish and finish the IRTF KATs.
Then I'd check the time and decide whether it's worth it to also fix the Edwards code today.
I've now finished my work on the Montgomery curves.
you can view the changes in my forked repo: https://github.com/DevJPM/cryptopp
Edwards support is still missing (it's causing some more issues right now) and the IRTF KATs are also not included - for now.
Nearly forgot: I didn't handle the stuff related to the makefiles and the VS solution files. So when the PR gets accepted, another commit has to be done to include the new files (ecpm.cpp, ecpm.h, ecpe.cpp and ecpe.h) in the appropriate places.
Depending on what the consensus is, I can pull request Montgomery and Edwards separately, or we wait a little bit more (one or two weeks) and I can PR the Edwards code as well.
Thanks JPM. What is the best way to provide feedback?
I imagine there's going to be a lot of trivial items in the first round of testing. Nothing major, just things like "Linux and GCC need X" and "OS X and Clang need Y". These are minor details that don't require a lot of discussion, so I don't see a lot of value in running them through a bug report (unless you want them there).
In the interim, how about setting up a couple of VMs for additional testing? For Windows, Oracle's Virtual Box is usually a good choice. Load it with the latest Fedora (23, IIRC) and either Debian Stable (8.2) or Ubuntu (14.04 LTS).
_If_ you choose VirtualBox, then drag and drop between the host and guest is completely broken. It's been that way for years, and it's not clear to me why it's even an option. In this case, you can SSH into the Linux guest; or you can set up shared folders.
VMware is another popular choice. Drag and drop between the host and guest usually works. I can't say always works because it did some nasty things to me at the 5.6.3 release.
I'd recommend you avoid Debian testing and unstable, but feel free to use them if you are OK with bleeding edge. I just know they cause me a lot of grief and I prefer to avoid them.
What is the best way to provide feedback?
@noloader just place a comment here or e-mail me if it is something longer.
Concerning the files, I just looked through them and I think we "only" need to add the two (four) new files (as entries) to filelist.txt, cryptlib.vc(x)proj, cryptdll.vc(x)proj
Concerning my short-term roadmap: I'll fix and finish ECPE and then I'll run the requested tests (afterwards) on my PC and on a ARMv7(?; RPi 2; Debian stable) device I own.
ETA of everything: end of January
@noloader just place a comment here or e-mail me if it is something longer.
OK, thanks. What email address would you like to use?
@noloader jean-pierre.muench at web dot de will do.
@JPM,
I'll fix and finish ECPE and then I'll run the requested tests (afterwards) on my PC
OK, thanks. Running the tests locally will save you a lot of headaches later (speaking from experience).
You might also consider a test account on the GCC Compile Farm Project. It will get you access to hardware and compilers you may not have. It fills in gaps in my local test environment.
on a ARMv7(?; RPi 2; Debian stable) device I own.
How is the Raspi working for you? I have one but it's older. I assumed it did not meet minimum requirements. I did not even try to set it up as a test environment.
I have a BeagleBone Black which provides ARM-HF (hard floats). After installing a Debian 8.2/LXDE flasher image, it was constantly running out of storage. I finally installed a console build (no window manager, no web server, etc), and I've got over 3.1 GB available.
Hey guys,
I'm just wondering - what is the current state of the safe curve implementation?
I'm in a position where it looks like I'll need to implement curve448-sha512 in SSH for key exchange, in order for it to have any hope of being standardized. I was hoping I'd be able to use Crypto++ for this, but it looks like time pressure is ramping up.
This has been a long time coming... We added x25519 key exchange and ed25519 signature support at Commit 13ea8f374f82aef2. Also see Pull Request 566, Add interface to TweetNaCl library.
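For anyone landing here, basic use of the new classes looks roughly like this (a sketch assuming the x25519 interface from xed25519.h in master; error handling omitted):

```cpp
#include <cryptopp/xed25519.h>
#include <cryptopp/osrng.h>
#include <cryptopp/secblock.h>

using namespace CryptoPP;

// Minimal x25519 key agreement between two parties.
void X25519Agreement()
{
    AutoSeededRandomPool prng;
    x25519 alice(prng), bob(prng);

    SecByteBlock privA(alice.PrivateKeyLength()), pubA(alice.PublicKeyLength());
    SecByteBlock privB(bob.PrivateKeyLength()),   pubB(bob.PublicKeyLength());
    alice.GenerateKeyPair(prng, privA, pubA);
    bob.GenerateKeyPair(prng, privB, pubB);

    SecByteBlock sharedA(alice.AgreedValueLength()), sharedB(bob.AgreedValueLength());
    bool okA = alice.Agree(sharedA, privA, pubB);
    bool okB = bob.Agree(sharedB, privB, pubA);
    // okA and okB should be true, and sharedA == sharedB (the 32-byte secret)
}
```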
Cleared at Crypto++ 6.0 release.
I'm going to open separate bug reports for curve448 and another one I need for IBE.
@noloader Are there any plans to add support for curve25519 and curve448 outside of the NaCl interface? I.e., make them usable like all the other curves (like secp, brainpool, ...)?
@Skycoder42,
Yes, I am working on native support now. The curve is tricky to integrate.
I'm also working on NaCl wrapper classes. NaCl is not easy to use as a set of stand alone functions.
Has there been any movement on the native support for curve448? I am writing a library for which Crypto++ would be ideal, but without this feature I will need to use a different library.
The following page gives a list of safe curves to use with ECC:
http://safecurves.cr.yp.to/
As far as I can see, Crypto++ doesn't currently implement any of the safe curves listed on that page (e.g. Curve25519), and a lot of the curves that are implemented have known issues (e.g. secp256k1).
I'm no expert, but would I be right in thinking that the work that needs doing is: (1) adding a new OID to oids.h, and (2) adding the curve parameters to eccrypto.cpp?