serenity-kit / opaque

Secure password based client-server authentication without the server ever obtaining knowledge of the password. Implementation of the OPAQUE protocol.
https://opaque-auth.com
MIT License

Feature Request: Allow specifying Argon2 parameters #119

Open TylerWanta opened 2 weeks ago

TylerWanta commented 2 weeks ago

I noticed that the argon2 package's default parameters are being used, which seem to be a bit below the recommended parameters outlined in the protocol.

Could an update be made to support specifying different parameters for argon2?

TylerWanta commented 2 days ago

I created a Pull Request with the proposed solution.

nikgraf commented 1 day ago

hey @TylerWanta, sorry for the late reply. If the current defaults are too low, I think we should focus on fixing the defaults rather than allowing different parameters.

My current assessment:

The defaults in https://docs.rs/argon2/0.5.3/argon2/struct.Params.html#associatedconstant.DEFAULT_M_COST are

DEFAULT_M_COST: u32 = 19_456u32
DEFAULT_T_COST: u32 = 2u32
DEFAULT_P_COST: u32 = 1u32
DEFAULT_OUTPUT_LEN: usize = 32usize
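
For reference, a minimal sketch (assuming the Rust argon2 crate, v0.5.x) that prints those associated constants:

```rust
use argon2::Params;

fn main() {
    // Associated constants from the argon2 crate, i.e. the current defaults.
    println!("m_cost     = {} KiB", Params::DEFAULT_M_COST);
    println!("t_cost     = {}", Params::DEFAULT_T_COST);
    println!("p_cost     = {}", Params::DEFAULT_P_COST);
    println!("output_len = {} bytes", Params::DEFAULT_OUTPUT_LEN);
}
```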

Spec currently recommends:

Argon2id(S = zeroes(16), p = 4, T = Nh, m = 2^21, t = 1, v = 0x13, K = nil, X = nil, y = 2)

https://datatracker.ietf.org/doc/draft-irtf-cfrg-opaque/

Conclusion:

- p (parallelism) cost is 1, but should be 4
- t (iterations) cost is 2, but should be 1
- m (memory) cost is 19456 KiB, but should be 2^21 = 2097152 KiB
- v is 0x13 (already the default)
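
For concreteness, here is a minimal sketch of the spec-recommended configuration expressed with the Rust argon2 crate (v0.5.x assumed; the 64-byte output assumes T = Nh = 64 for a SHA-512 suite, so adjust if the suite differs):

```rust
use argon2::{Algorithm, Argon2, Params, Version};

fn main() -> Result<(), argon2::Error> {
    // Spec-recommended settings from draft-irtf-cfrg-opaque:
    // m = 2^21 KiB (~2 GiB), t = 1, p = 4, v = 0x13.
    let params = Params::new(1 << 21, 1, 4, Some(64))?;
    let _argon2 = Argon2::new(Algorithm::Argon2id, Version::V0x13, params);
    Ok(())
}
```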

Is this correct? If so I think we should update the defaults in the library to match the spec. Or do you have any good reasons to use other parameters?

TylerWanta commented 1 day ago

Admittedly, I was only thinking about the low parallelism cost when originally looking at the parameters. Now, seeing the jump in memory (~20 MB -> ~2 GB) is kind of concerning.

Doing some research, it seems like 2 GB is quite a high memory cost. Testing this locally, it takes 14.71s to run the algorithm with the recommended parameters (t=1, p=4, m=2097152), even though my computer specs are quite high:

Graphics Card: NVIDIA GeForce RTX 4080
Processor: 13th Gen Intel(R) Core(TM) i9-13900KF 3000 Mhz, 24 Core(s)
Ram: 32 GBs

Considering this is an OPAQUE package, updating the parameters to the recommended values still seems like a good idea to me, but with the memory cost as high as it is, allowing developers to tone it down a bit would be needed. Otherwise, forcing the defaults could easily prevent developers from being able to use the package due to the abnormally high resource usage.

FWIW, the default argon2 parameters take me ~250ms to run, which is a bit low. Usually the sweet spot is ~1s from my understanding.

Another argument for allowing argon2 parameters to be specified is that hardware advances. What takes ~1s today could take ~250ms in a couple of years, in which case the parameters would need to be increased. From my knowledge, monitoring algorithm times and adjusting parameters accordingly is fairly common.
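
As a sketch of what that kind of measurement could look like (assuming the Rust argon2 crate, v0.5.x; the memory-cost grid and the ~1s budget are just illustrative):

```rust
use std::time::Instant;

use argon2::{Algorithm, Argon2, Params, Version};

fn main() -> Result<(), argon2::Error> {
    let password = b"example-password";
    let salt = [0u8; 16]; // fixed salt, for benchmarking only
    let mut output = [0u8; 64];

    // Try a few memory costs (in KiB) at the spec's t = 1, p = 4 and time a
    // single hash, to see which setting lands closest to a ~1s budget.
    for m_cost in [64 * 1024, 256 * 1024, 1 << 20, 1 << 21] {
        let params = Params::new(m_cost, 1, 4, Some(64))?;
        let argon2 = Argon2::new(Algorithm::Argon2id, Version::V0x13, params);

        let start = Instant::now();
        argon2.hash_password_into(password, &salt, &mut output)?;
        println!("m_cost = {m_cost:>7} KiB -> {:?}", start.elapsed());
    }
    Ok(())
}
```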

I guess to flip the question: did you have a specific reason for not allowing parameters to be specified? The parameters still go through validation, so it should be easy to tell someone if they are trying to use parameters that are incompatible.
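
On the validation point, if I read the argon2 crate correctly, out-of-range values are already rejected when the Params are constructed, so a wrapper could surface that error directly. A small sketch (the concrete values are just illustrative):

```rust
use argon2::Params;

fn main() {
    // t_cost = 0 is below the crate's minimum iteration count, so construction
    // fails with a descriptive error rather than silently accepting the value.
    match Params::new(19_456, 0, 1, Some(32)) {
        Ok(_) => println!("parameters accepted"),
        Err(err) => println!("parameters rejected: {err}"),
    }
}
```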

nikgraf commented 1 day ago

To answer your question: from my experience, sane defaults and fewer options are usually better, especially when it comes to cryptography. I think us exploring this is a good example of how complex it is and how easy it is to screw it up.

We could still make it flexible, but the most important thing should be to have good defaults. The 2GB memory requirement is indeed concerning. I'll see if I can check with other people who use OPAQUE in production about their experience.