The Rust implementation diverges from the old C++ implementation with regard to the maximum size of the plaintext modulus T. Since the correctness of the parameters was established by running practical experiments with the C++ code, we need to validate the correct value of T. To do that, we need to compute the theoretical estimates and compare them with the expected values from both implementations. Two cases are possible:
- The C++ code was wrong, and T is indeed slightly smaller than expected. This is plausible, since the old implementation did not focus on noise sampling or on optimality of instantiation. Given that the difference is small, this is the most likely possibility.
- Noise growth (or some other problem) makes the Rust code wrong; in that case we would have to fix the issue as part of the deliverable.
Ideally, the theoretical estimate should be implemented, so that we can use it to guarantee that we are using the maximal T.
_Originally posted by @teor2345 in https://github.com/Inversed-Tech/eyelid/pull/98#discussion_r1625281345_
Symptoms of this bug
Specifically, larger values of `T` cause failures in the homomorphic `positive_multiplication_test()` (but not the negative multiplication, addition, keygen, encryption, or decryption tests).
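As a starting point, the theoretical estimate proposed above could be sketched with a simplified noise-growth model. The sketch below is illustrative only: the fresh-noise bound, the multiplication growth factor, and the decryption condition are placeholder assumptions and would need to be replaced by the scheme's actual noise analysis before being used to validate T.

```rust
/// Hypothetical, simplified noise model for estimating the maximal
/// plaintext modulus T. All bounds here are placeholder assumptions,
/// not the scheme's real analysis.
///
/// Assumed model: one homomorphic multiplication scales the noise by
/// roughly n * T, and decryption succeeds while noise < q / (2 * T).
/// Then: fresh_noise * n * T < q / (2 * T)
///   =>  T < sqrt(q / (2 * n * fresh_noise))
fn max_plaintext_modulus(q: f64, n: f64, fresh_noise: f64) -> u64 {
    (q / (2.0 * n * fresh_noise)).sqrt().floor() as u64
}

fn main() {
    // Example parameters (illustrative only, not the project's real ones).
    let q = 2f64.powi(120); // ciphertext modulus
    let n = 2048.0;         // ring dimension
    let b = 6.0 * 3.2;      // fresh-noise bound, e.g. 6 sigma for sigma = 3.2
    let t_max = max_plaintext_modulus(q, n, b);
    println!("estimated maximal T under the toy model: {t_max}");
}
```

With such a function in place, the positive-multiplication test could assert that its T is at or below the estimate, which would directly distinguish the two cases above.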