kayabaNerve opened 1 year ago
To create an official statement of awareness and policy:
Intel CPUs do not guarantee instructions execute in constant time with respect to their inputs. There is a flag (Data Operand Independent Timing) which can be set to improve this, yet constant time still isn't guaranteed.
The recent KyberSlash disclosures highlighted how AMD CPUs and SiFive's RISC-V cores take a variable number of cycles for their division operations, depending on the numerators.
Compiler backends and architecture targets will always risk introducing non-constant-time transformations.
Serai should consider itself constant time if it can be compiled in a way in which it is constant time, demonstrating that the Serai code itself is constant time and that any variable timing is due to extraneous factors. While Serai should make an effort to mitigate extraneous factors (and apparently avoid `/`), the usage of `/` should not by itself be considered as the code fundamentally not being constant time.
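As a sketch of what avoiding the hardware division instruction can look like, a fixed divisor can be replaced with a precomputed multiply-and-shift (Barrett-style). The modulus, names, and input bound below are illustrative assumptions, not Serai code:

```rust
/// Illustrative division by a fixed modulus without the hardware `div`
/// instruction (whose latency may depend on its operands, per KyberSlash).
/// Q and the input bound are assumptions for this sketch, not Serai code.
const Q: u64 = 3329; // example modulus (Kyber's q)
// ceil(2^32 / Q), precomputed so runtime is only a multiply and a shift
const M: u64 = ((1u64 << 32) + Q - 1) / Q;

/// floor(x / Q) for x up to roughly 2^21, with no division instruction.
/// (Multiplication is data-independent in time on mainstream modern CPUs,
/// though even that isn't architecturally guaranteed, as noted above.)
fn div_by_q(x: u64) -> u64 {
    (x * M) >> 32
}

fn main() {
    // spot-check against the compiler's division on the supported range
    for x in [0, 1, Q - 1, Q, Q + 1, 65535, 1_000_000] {
        assert_eq!(div_by_q(x), x / Q);
    }
    println!("multiply-and-shift matches `/` on the checked range");
}
```

Whether a given compiler backend actually keeps this free of `div` is exactly the "extraneous factor" the policy above is about, so it would still need verification on the target.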
Using WASM as reference hardware makes sense to me, or RISC-V, as there are RISC-V interpreters available which would offer a much simpler specification.
https://github.com/phayes/sidefuzz counts cycles via WASM to ensure consistency regardless of the inputs used. We can use it, or potentially build our own minimal version of it.
Then integrating dudect, or something similar, in the future would be great.
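For context, dudect's core idea is a Welch's t-test over timing samples from two input classes (a "fixed" class and a "random" class). A minimal sketch of just that statistic, with hypothetical names and the timing-collection harness omitted:

```rust
/// Welch's t-statistic between two samples, the statistic dudect applies
/// to timing measurements of a "fixed" versus a "random" input class.
/// Only the statistic is sketched here; measurement collection is omitted.
fn welch_t(a: &[f64], b: &[f64]) -> f64 {
    let mean = |s: &[f64]| s.iter().sum::<f64>() / s.len() as f64;
    // unbiased sample variance (n - 1 denominator)
    let var = |s: &[f64], m: f64| {
        s.iter().map(|x| (x - m).powi(2)).sum::<f64>() / (s.len() - 1) as f64
    };
    let (ma, mb) = (mean(a), mean(b));
    (ma - mb) / (var(a, ma) / a.len() as f64 + var(b, mb) / b.len() as f64).sqrt()
}

fn main() {
    // two synthetic "timing" samples; a large |t| would flag a timing leak
    let fixed = [100.0, 101.0, 99.0, 100.0];
    let random = [100.0, 99.0, 101.0, 100.0];
    println!("t = {}", welch_t(&fixed, &random));
}
```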