cryspen / hacl-packages

The Cryspen HACL Distribution
https://cryspen.com/hacl-packages

benchmarks: Sanity check of benchmarks. #337

Open · duesee opened 1 year ago

duesee commented 1 year ago

@franziskuskiefer, do you want to provide input on this? Otherwise I'll just start with what I think is appropriate. Naming and ordering are not super important, but let's settle them now while it's still easily possible and it makes the next steps easier.

franziskuskiefer commented 1 year ago

Some thoughts, but just go ahead.

duesee commented 1 year ago

Try to make sure that only the actual function in question is measured, i.e., as little of everything else (allocations etc.) as possible. For example, we don't care about the time it takes to load a public key to verify a signature; we don't have much influence on that. We want to know how long the function takes to verify a signature.
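To make that concrete, here is a minimal sketch assuming Google Benchmark (which I believe the hacl-packages benchmarks use) and the Ed25519 API from `Hacl_Ed25519.h`; the signatures follow the generated header as I understand it, so double-check them against the actual checkout. All key and message setup happens outside the timed loop, so only the verify call is measured:

```cpp
#include <benchmark/benchmark.h>

#include <cstdint>
#include <vector>

#include "Hacl_Ed25519.h"

static void BM_Ed25519_Verify(benchmark::State& state) {
  // Setup: key derivation, message, and signature are prepared outside the
  // timed loop, so allocations and key loading are not measured.
  uint8_t sk[32] = {1};  // placeholder secret key, for illustration only
  uint8_t pk[32];
  Hacl_Ed25519_secret_to_public(pk, sk);

  std::vector<uint8_t> msg(1024, 0x42);
  uint8_t sig[64];
  Hacl_Ed25519_sign(sig, sk, (uint32_t)msg.size(), msg.data());

  for (auto _ : state) {
    // Only the verification call itself is inside the timed region.
    bool ok = Hacl_Ed25519_verify(pk, (uint32_t)msg.size(), msg.data(), sig);
    benchmark::DoNotOptimize(ok);
  }
}
BENCHMARK(BM_Ed25519_Verify);
```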

I think it would help to treat comparisons (e.g., with OpenSSL) and regression testing separately: To have comparable benchmarks, we should make sure that we "do the same" for HACL and, e.g., OpenSSL. For example, when the API call in HACL is unified such that it always hashes a message before signing, while OpenSSL hashes and signs in two steps, we should make sure to include the hashing step in the OpenSSL measurement. Otherwise we compare hash-and-sign against sign-only.
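As a sketch of the OpenSSL side of such a comparison (using the legacy `SHA256`/`ECDSA_do_sign` calls for brevity; they are deprecated in OpenSSL 3.0 but still available), both hashing and signing sit inside the timed region, so the measurement matches a HACL call that hashes internally before signing:

```cpp
#include <benchmark/benchmark.h>

#include <openssl/ec.h>
#include <openssl/ecdsa.h>
#include <openssl/obj_mac.h>
#include <openssl/sha.h>

#include <cstdint>
#include <vector>

static void BM_OpenSSL_EcdsaP256_HashAndSign(benchmark::State& state) {
  // Key generation is setup and stays outside the timed loop.
  EC_KEY* key = EC_KEY_new_by_curve_name(NID_X9_62_prime256v1);
  EC_KEY_generate_key(key);
  std::vector<uint8_t> msg(1024, 0x42);

  for (auto _ : state) {
    // Hash *and* sign inside the loop, so the timed region is comparable to
    // a HACL API that hashes the message internally before signing.
    uint8_t digest[SHA256_DIGEST_LENGTH];
    SHA256(msg.data(), msg.size(), digest);
    ECDSA_SIG* sig = ECDSA_do_sign(digest, SHA256_DIGEST_LENGTH, key);
    benchmark::DoNotOptimize(sig);
    ECDSA_SIG_free(sig);
  }
  EC_KEY_free(key);
}
BENCHMARK(BM_OpenSSL_EcdsaP256_HashAndSign);
```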

For regression testing, I agree with your comment. We can put as much as needed into the setup and only measure the single function we don't want to regress.
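When an iteration needs fresh input, so the setup can't be hoisted out of the loop entirely, Google Benchmark's `PauseTiming`/`ResumeTiming` can exclude it from the measurement (with the caveat that pausing itself adds some overhead). A sketch, again assuming the `Hacl_Ed25519.h` signatures from above:

```cpp
#include <benchmark/benchmark.h>

#include <cstdint>
#include <cstdlib>
#include <vector>

#include "Hacl_Ed25519.h"

static void BM_Ed25519_Sign_FreshMessage(benchmark::State& state) {
  uint8_t sk[32] = {1};  // placeholder secret key, for illustration only
  std::vector<uint8_t> msg(1024);
  uint8_t sig[64];

  for (auto _ : state) {
    // Per-iteration setup, excluded from the measurement.
    state.PauseTiming();
    for (auto& b : msg) b = (uint8_t)rand();
    state.ResumeTiming();

    // Only the signing call we don't want to regress is measured.
    Hacl_Ed25519_sign(sig, sk, (uint32_t)msg.size(), msg.data());
    benchmark::DoNotOptimize(sig);
  }
}
BENCHMARK(BM_Ed25519_Sign_FreshMessage);
```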

For some benchmarks, comparison and regression testing align, but not for all. I will take a look and point out those cases.