freenet / freenet-core

Declare your digital independence
https://freenet.org/
Apache License 2.0

Evaluate quantum safety of the cryptographic algorithms we're using #429

Open sanity opened 1 year ago

sanity commented 1 year ago

Quantum computers have the potential to break many of the cryptographic algorithms that are currently used to secure communications and protect data. This is because quantum computers can perform certain computations much faster than classical computers, which allows them to solve problems that are considered difficult or infeasible for classical computers.

Some cryptographic algorithms that are not quantum-safe include:

On the other hand, some cryptographic algorithms are believed to be quantum-safe because they are based on mathematical problems that are believed to be difficult for quantum computers to solve. These algorithms include:

It is important to note that the security of these quantum-safe algorithms is still being researched and is not yet fully understood. Some of these algorithms may eventually be broken by advances in quantum computing or by new attacks that have not yet been discovered.

In summary, some cryptographic algorithms are quantum-safe because they are based on mathematical problems that are believed to be difficult for quantum computers to solve, while other algorithms are not quantum-safe because they are vulnerable to attacks by quantum computers.

github-actions[bot] commented 1 year ago

Pivotal Tracker story: https://www.pivotaltracker.com/story/show/184058010

TheRook commented 1 year ago

Symmetric ciphers are impacted by Grover's algorithm, which provides only a quadratic speedup, so 256-bit keys are generally still considered safe even against a quite advanced quantum computer. Make sure Freenet doesn't use any 128-bit symmetric ciphers; AES-256-GCM is the current industry recommendation and is widely used in compliance contexts.
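As a back-of-the-envelope illustration of the point above (function name is hypothetical), Grover's quadratic speedup effectively halves a symmetric key's length:

```python
# Grover's algorithm searches an n-bit keyspace in roughly 2^(n/2)
# operations, so a symmetric key's post-quantum security level is
# about half its length in bits.
def grover_effective_bits(key_bits: int) -> int:
    """Approximate post-quantum security of an n-bit symmetric key."""
    return key_bits // 2

print(grover_effective_bits(128))  # 64  -> too weak, avoid
print(grover_effective_bits(256))  # 128 -> still considered safe (AES-256)
```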

ECC is particularly vulnerable to attacks mounted by a quantum computer, and this is well documented. DSA and RSA are also affected, but are more resistant than ECC. The same efficiencies that make ECC less CPU-intensive also make it faster for an attacker to recover the corresponding private key using Shor's algorithm.
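To see why Shor's algorithm is so damaging to RSA-style keys, here is a classical toy version of the order-finding step that a quantum computer performs exponentially faster (tiny numbers, purely illustrative):

```python
from math import gcd

def factor_via_order(N: int, a: int):
    """Toy Shor: find the multiplicative order r of a mod N by brute
    force, then derive factors of N from gcd(a^(r/2) ± 1, N). A quantum
    computer finds r exponentially faster; the rest is classical."""
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:                      # the trick needs an even order
        return None
    x = pow(a, r // 2, N)
    p = gcd(x - 1, N)
    if 1 < p < N:
        return sorted((p, N // p))
    return None

print(factor_via_order(15, 7))  # [3, 5]
```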

Now here is where things get really interesting. If I am not mistaken, not only is Freenet's use of ECC vulnerable to quantum attacks, it would allow an adversary to deface a Freenet website or otherwise impersonate a targeted publisher on the network. Freenet and many other dapps are disproportionately affected because they rely on long-lived key pairs to serve as the identity. As a counterexample, think of TLS: there the adversary still has to MITM the traffic, which greatly limits who could exploit this issue, and TLS certificates typically expire after a year, making the window for exploitation much smaller than for a bitcoin wallet or a Freenet identity, which can be exploited anywhere, at any time.

So, officially NIST has not approved a post-quantum signature algorithm to replace ECC, but as of 2023 we are down to four finalists. The current guidance is to use as large an asymmetric key as possible, and to use it for as short a period of time as possible. Build in future-proofing and update as soon as one of the four NIST finalists is approved.

From a design perspective, the clock starts ticking the moment the full public key is known to the attacker. If the attacker only has a signature, but not the public key to verify it, then he cannot start factoring the private key. What this means is that a peer can designate a new key by simply presenting the fingerprint of the next public key, without revealing any additional information that would aid in attacking that key. This gives the user the option to jump to a fresh key that we know hasn't been compromised.
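A minimal sketch of that pre-announcement idea (the helper name is illustrative, and BLAKE2s-256 is assumed here because it comes up later in this thread):

```python
import hashlib

def key_fingerprint(public_key: bytes) -> bytes:
    # Publish only the hash of the *next* public key. Shor's algorithm
    # needs the full public key, so a bare fingerprint gives an attacker
    # nothing to work on until the key is actually revealed.
    return hashlib.blake2s(public_key, digest_size=32).digest()

next_public_key = b"\x01" * 32            # placeholder for a real key
commitment = key_fingerprint(next_public_key)

# Later, when rotating, the peer reveals next_public_key and anyone
# can verify it matches the earlier commitment.
assert key_fingerprint(next_public_key) == commitment
```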

sanity commented 1 year ago

@TheRook Very helpful, thank you.

The current guidance is to use as large an asymmetric key as you can, and use it for as short a period of time as possible

Can you quantify "large" in this context? For example, would a 256-bit ECC key be considered quantum-safe? For some of the early systems we're considering, we'll need a quantum-safe algorithm that's capable of blind signatures, which RSA supports and I believe EC does too.
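For reference, the RSA blind-signature flow mentioned here works like this (textbook toy parameters, nowhere near production size; the helper names are illustrative):

```python
from math import gcd

# Toy RSA key (textbook example): n = 61 * 53, with e*d ≡ 1 mod φ(n)
n, e, d = 3233, 17, 2753

def blind(message: int, r: int) -> int:
    # Requester hides the message with a random blinding factor r.
    return (message * pow(r, e, n)) % n

def sign(blinded: int) -> int:
    # Signer signs without ever seeing the real message.
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    # Requester strips the blinding factor to recover a valid signature.
    return (blind_sig * pow(r, -1, n)) % n

m = 42
r = 7                       # must satisfy gcd(r, n) == 1
assert gcd(r, n) == 1
s = unblind(sign(blind(m, r)), r)
assert s == pow(m, d, n)    # identical to signing m directly
assert pow(s, e, n) == m    # verifies like a normal RSA signature
```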

Make sure your system is future-proof and that you can update as soon as one of the four NIST finalists is approved.

That shouldn't be a problem; the only crypto algorithm hardwired in Locutus is the key hash algorithm (currently BLAKE2s-256). Everything else is "software" that's under the control of developers building apps.

TheRook commented 1 year ago

Can you quantify "large" in this context? For example, would a 256-bit ECC key be considered quantum-safe? For some of the early systems we're considering, we'll need a quantum-safe algorithm that's capable of blind signatures, which RSA supports and I believe EC does too.

Ah, so for ECC, each curve you choose has an associated key length. For example, the NSA recommends P-384, which produces 384-bit keys. This is a recommended quantum-hardening measure until we have a NIST-approved replacement: https://www.johndcook.com/blog/2019/05/11/elliptic-curve-p-384/

I can see why a project like Freenet might not want to take the advice of the NSA, and to be honest I'm not sure what the next best curve is, but here is a good list with a detailed explanation of each one: https://www.johndcook.com/blog/2019/02/15/elliptic-curve-names/

TheRook commented 1 year ago

Other curves you may want to consider are E-521 and M-511, which support larger key sizes than NIST's recommended P-384. https://safecurves.cr.yp.to/

The rationale for diverging from NIST's guidance is that the adversary's most predictable advantage is simply having more money to buy a better computer. Increasing the key size ensures that attacking our design is as expensive as possible. If everyone else is using P-384 or weaker, then any would-be adversary should consider those easier targets before going after a much more difficult one. (So long as excessive asymmetric handshakes don't create a bottleneck.)

Additionally, there are other approaches intended to resist Shor's algorithm; one solution that is very interesting is SIKE (Supersingular Isogeny Key Exchange): https://en.wikipedia.org/wiki/Supersingular_isogeny_key_exchange

Destroyinator69420 commented 1 year ago

If you need a bigger symmetric key you could use the AES successor Kalyna, which is the national symmetric encryption standard of Ukraine. It supports 512-bit keys; AES only goes up to 256 bits.

Destroyinator69420 commented 1 year ago

As for asymmetric encryption, you could use CRYSTALS-Kyber and RSA. Just make sure the RSA key size is very large, like 32768 bits, to make sure it is future-proof. I find it okay that my Locutus node will run slower as a result if it means the cryptography securing my identity is future-proof.

iduartgomez commented 10 months ago

So here is a list of algorithms we are currently using and actions I am thinking we could take, considering things like ergonomics, maturity of available implementations, etc.:

My biggest question right now is what we could use for symmetric encryption. I chose ChaCha20 because its performance in pure software implementations (without adequate hardware support) is usually better than AES's, and the security level is similar. Since we want to support multiple targets, which may not have proper hardware acceleration, ChaCha20 seems like a good candidate.

FYI @sanity