snwagh / falcon-public

Implementation of protocols in Falcon

Batch Normalization protocol does not match code implementation. #22

Open llCurious opened 2 years ago

llCurious commented 2 years ago

Hey, snwagh. I have been reading your Falcon paper and found this repo. I am interested in how you perform the Batch Normalization computation.

I have the following three questions:

  1. The implementation of BN seems to be just a single division [code screenshots attached]. (A plaintext sketch of what I mean follows this list.)

  2. The protocol for Pow seems to reveal information about the exponent, i.e., \alpha.

  3. The BIT_SIZE in your paper is 32, which seems too small. How do you guarantee the accuracy, or rather the precision? Is BN actually essential to your ML training and inference?
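
For concreteness, here is a minimal plaintext (non-MPC) sketch of the single-division BN I am referring to. The 13-bit fractional precision, the folding of gamma/beta into the constants, and all names are illustrative assumptions, not taken from the repo:

```cpp
#include <cstdint>
#include <cmath>
#include <iostream>
#include <vector>

// Illustrative fixed-point parameters; Falcon's actual precision may differ.
constexpr int FP_BITS = 13;
constexpr int64_t FP_ONE = 1LL << FP_BITS;

int64_t toFixed(double x)    { return static_cast<int64_t>(std::llround(x * FP_ONE)); }
double  fromFixed(int64_t x) { return static_cast<double>(x) / FP_ONE; }

// Inference-time BN: y = (x - mu) / sqrt(var + eps), with gamma = 1 and
// beta = 0 for simplicity. Once mu and denom = sqrt(var + eps) are
// precomputed, the online step per element is a single fixed-point division.
std::vector<int64_t> bnForward(const std::vector<int64_t>& x,
                               int64_t mu, int64_t denom) {
    std::vector<int64_t> y(x.size());
    for (size_t i = 0; i < x.size(); ++i)
        y[i] = (x[i] - mu) * FP_ONE / denom;  // rescale so the quotient keeps FP_BITS
    return y;
}

int main() {
    std::vector<int64_t> x = { toFixed(1.0), toFixed(2.0), toFixed(3.0) };
    int64_t mu    = toFixed(2.0);
    int64_t denom = toFixed(std::sqrt(2.0 / 3.0 + 1e-5));
    for (int64_t v : bnForward(x, mu, denom))
        std::cout << fromFixed(v) << "\n";    // approx -1.22, 0, 1.22
}
```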

snwagh commented 2 years ago

@llCurious I've added responses to your questions, in order, below:

llCurious commented 2 years ago

Thank you for your responses. Do you mean that the division protocol can currently only handle the case where all the divisors share the same exponent? In other words, if the divisors in the vector have different exponents, does the current division protocol fail?

BTW, you seem to have missed my question about the BN protocol. You mention that a larger bit-width or an adaptive setting of the fixed-point precision can be helpful in end-to-end training; do you mean to employ BN to tackle this problem?

snwagh commented 2 years ago

Yes, that is correct. Either all the exponents have to be the same or the protocol doesn't really guarantee any correctness.
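
To illustrate why, here is a plaintext sketch (not the actual Falcon protocol; the precision, the Goldschmidt-style seed constant 2.9142, and all names are assumptions for demonstration). The divisor is normalized by a single public exponent \alpha (the value the Pow step works with) before the reciprocal approximation, so reusing one exponent across divisors of different magnitudes produces garbage:

```cpp
#include <cstdint>
#include <cmath>
#include <iostream>

constexpr int FP_BITS = 13;
constexpr int64_t FP_ONE = 1LL << FP_BITS;

int64_t toFixed(double x)    { return static_cast<int64_t>(std::llround(x * FP_ONE)); }
double  fromFixed(int64_t x) { return static_cast<double>(x) / FP_ONE; }
// Shift right by s, treating negative s as a left shift.
int64_t shr(int64_t x, int s) { return s >= 0 ? (x >> s) : (x << -s); }

// Smallest alpha with b / 2^alpha in [0.5, 1); alpha plays the role of
// the public exponent.
int exponentOf(int64_t b) {
    int alpha = 0; int64_t t = b;
    while (t >= FP_ONE)     { t >>= 1; ++alpha; }
    while (t < FP_ONE / 2)  { t <<= 1; --alpha; }
    return alpha;
}

// a / b using ONE exponent alpha for the whole batch. The seed
// w0 = 2.9142 - 2*btilde is only accurate when btilde is in [0.5, 1),
// i.e., when alpha really is b's exponent.
int64_t divideWithExponent(int64_t a, int64_t b, int alpha) {
    int64_t btilde = shr(b, alpha);
    int64_t w = toFixed(2.9142) - 2 * btilde;                        // seed ~ 1/btilde
    w = (w * (2 * FP_ONE - ((btilde * w) >> FP_BITS))) >> FP_BITS;   // Newton step
    return shr((a * w) >> FP_BITS, alpha);                           // undo normalization
}

int main() {
    int64_t a  = toFixed(6.0);
    int64_t b1 = toFixed(3.0), b2 = toFixed(12.0);
    int alpha1 = exponentOf(b1);                                      // 2
    std::cout << fromFixed(divideWithExponent(a, b1, alpha1)) << "\n"; // approx 2.0
    // Reusing b1's exponent for b2 (whose exponent is 4) breaks correctness:
    std::cout << fromFixed(divideWithExponent(a, b2, alpha1)) << "\n"; // far from 0.5
}
```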

About your BN question: like I said, end-to-end training in MPC has not been studied (there are still many open challenges there), so it is hard to comment empirically on the use of BN for training. However, the benefits of BN are known from the plaintext ML literature, and the idea is that those benefits (improved convergence/stability) should translate to secure computation too. Does this answer your question? If you're asking whether BN will help train a network in the current code base, then I'd say no; though it is an issue, it is not the only issue preventing training.

llCurious commented 2 years ago

OK, I got it. Sorry for the late reply!

Thanks a lot for your patient answers!
