Closed: jon-chuang closed this issue 4 years ago.
You are absolutely right: CKKS bootstrapping is not bootstrapping in Gentry's sense. Specifically, it does not allow unlimited computation, because it accrues so much error that the message is quickly gone, even after two bootstrappings. The only use case I know of where bootstrapping might make sense is gradient-descent training of simple ML models, where the computation itself helps correct the error introduced by bootstrapping. This misunderstanding, together with the highly complex parameterization, is why we have not released CKKS bootstrapping in SEAL.
Hi Kim,
Actually, could you elaborate a little more on this point? From what I understand, the polynomial approximation to the modular reduction introduces some error, but it is bounded. If one chooses the right parameters, one could still have many levels left (e.g. 19 out of 29). However, this comes at greatly reduced bit precision (~10 bits). Perhaps this is a fundamental challenge that still needs to be overcome?
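For intuition on where that bounded error comes from: the standard approach (due to Cheon et al.) approximates the centered reduction t mod q by the scaled sine (q/2π)·sin(2πt/q), which is accurate precisely when the message is small relative to q. A rough numeric sketch in Python (the values of q, m, and the multiple of q below are illustrative assumptions, not SEAL parameters):

```python
import math

def approx_mod(t, q):
    # Scaled-sine approximation of the centered reduction t mod q,
    # the idea behind the EvalMod step of CKKS bootstrapping.
    return (q / (2 * math.pi)) * math.sin(2 * math.pi * t / q)

q = 2**40            # illustrative ciphertext modulus
m = 2**20            # message, small relative to q
t = 17 * q + m       # coefficient after raising the modulus: k*q + m
error = abs(approx_mod(t, q) - m)
print(error / m)     # bounded, and tiny while |m| << q
```

The approximation error scales like (m/q)^2, which is why it is bounded but also why it eats precision: the closer the message magnitude gets to q, the worse the relative error.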
Is bootstrapping available in the newest release 4.0.0? There seems to be little discussion on this topic.
> Hi Kim,
>
> Actually, could you elaborate a little more on this point? From what I understand, the polynomial approximation to the modular reduction introduces some error, but it is bounded. If one chooses the right parameters, one could still have many levels left (e.g. 19 out of 29). However, this comes at greatly reduced bit precision (~10 bits). Perhaps this is a fundamental challenge that still needs to be overcome?
I believe you have explained Kim's earlier statement yourself: your observation that the result comes with much-reduced bit precision means that bootstrapping does not recover the full precision. After a few bootstrapping operations, then, all precision is gone.
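The depletion can be made concrete with back-of-envelope arithmetic (the precision figures below are assumptions for illustration; the real numbers depend heavily on parameters and the implementation):

```python
# Hypothetical figures: ~30 bits of message precision to start,
# and each bootstrap consuming ~12 bits of it.
precision_bits = 30
loss_per_bootstrap = 12

bootstraps = 0
while precision_bits > loss_per_bootstrap:
    precision_bits -= loss_per_bootstrap
    bootstraps += 1
print(bootstraps, precision_bits)  # prints: 2 6
```

With these assumed numbers, the message is down to a few meaningless bits after just two bootstraps, matching the observation above.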
> Is bootstrapping available in the newest release 4.0.0? There seems to be little discussion on this topic.
No, we do not have a current plan to release CKKS bootstrapping.
> one chooses the right parameters

Hi, does "one chooses the right parameters" mean bootstrapping?
> one chooses the right parameters
>
> Hi, does "one chooses the right parameters" mean bootstrapping?
No. See the example 1_bfv_basics to understand what choosing the right parameters means.
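To make "the right parameters" concrete for CKKS: the total bit count of coeff_modulus is capped per poly_modulus_degree at a given security level (SEAL's 128-bit-security defaults, e.g. CoeffModulus::MaxBitCount(16384) == 438), and the level budget falls out of simple division. The reservation of two 60-bit primes below is one common accounting convention, not the only valid one:

```python
# SEAL's coeff_modulus bit-count caps at 128-bit security
# (the values returned by CoeffModulus::MaxBitCount).
max_coeff_bits = {4096: 109, 8192: 218, 16384: 438, 32768: 881}

def usable_levels(poly_degree, scale_bits=40, special_bits=60):
    # Reserve one special prime for key switching and one large
    # first prime for decryption precision; what remains of the
    # budget is spent on rescaling levels of scale_bits each.
    budget = max_coeff_bits[poly_degree] - 2 * special_bits
    return budget // scale_bits

print(usable_levels(16384))  # (438 - 120) // 40 == 7
```

This is why "many levels left" in the discussion above requires either a very large ring degree or small scale primes, and small scale primes are exactly what costs bit precision.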
I am now interested in implementing and accelerating bootstrapping in SEAL, as it seems like a more practical route to my goal (deep circuits) than increasing the number of levels, which comes at a computational and memory cost. I am interested in continuing to use a large n, to reduce the number of bootstraps required.
If we can get a 100x speedup over current single-core benchmarks simply by accelerating on hardware and exploiting parallelism, bootstrapping could be reduced to ~1s, which might be tolerable, especially since the SqueezeNet evaluation by EVA running on 56 logical cores has a latency of ~1s.
I have to check the size of the tensor in the intermediate layer to see how many bootstraps would need to be evaluated. In the SqueezeNet case, if we perform a bootstrap right after a squeeze layer, we may be able to get by with a single bootstrap to add 10 more levels; perhaps 2 bootstraps would then take us from the current 4 fire modules up to 10. With some luck, the latency could stay around 6s for a depth-20 network.
From a scientific point of view, I am interested in the compounded arithmetic error for deeper circuits. I don't believe this arithmetic error is alleviated by bootstrapping, and I believe it is already significant for SqueezeNet.
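A toy simulation of that concern (plain floating point, not CKKS): inject a fresh per-level relative error into a chain of multiplications and watch it compound with depth. The depth and the ~20-bit precision figure are illustrative assumptions:

```python
import random

random.seed(0)  # deterministic for illustration

def compounded_error(depth, rel_err):
    # Multiply up a chain of constants, injecting fresh relative
    # error at every multiplicative level, and compare the result
    # to the exact product.
    exact, approx = 1.0, 1.0
    for _ in range(depth):
        exact *= 1.1
        approx *= 1.1 * (1 + random.uniform(-rel_err, rel_err))
    return abs(approx - exact) / exact

# Error grows with depth whether or not we bootstrap, since
# bootstrapping only refreshes the modulus chain, not the
# arithmetic error already baked into the message.
print(compounded_error(20, 2**-20))
```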