tf-encrypted / moose

Secure distributed dataflow framework for encrypted machine learning and data processing
Apache License 2.0

Probabilistic rounding when encoding fixed-points #521

Open mortendahl opened 3 years ago

mortendahl commented 3 years ago

Using probabilistic rounding instead of flooring when encoding fixed-point values seems to offer accuracy benefits from the ML perspective. This is noted in, for instance, KS'21.

This issue is about investigating these claims and potentially updating the code base to use probabilistic rounding.
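For illustration, here is a minimal numpy sketch of the difference (function names are illustrative, not Moose's actual encoding API): flooring introduces a systematic downward bias, while probabilistic rounding rounds up with probability equal to the fractional remainder, making the encoding unbiased in expectation.

```python
import numpy as np

def encode_floor(x, frac_bits):
    # current approach: scale by 2**frac_bits and floor
    return np.floor(np.asarray(x) * 2**frac_bits).astype(np.int64)

def encode_stochastic(x, frac_bits, rng=None):
    # round up with probability equal to the fractional remainder,
    # making the encoding unbiased: E[enc(x)] == x * 2**frac_bits
    rng = np.random.default_rng() if rng is None else rng
    scaled = np.asarray(x, dtype=np.float64) * 2**frac_bits
    lo = np.floor(scaled)
    return (lo + (rng.random(lo.shape) < scaled - lo)).astype(np.int64)

# flooring always maps 0.3 to 76 (= floor(0.3 * 2**8)), i.e. ~0.2969,
# while stochastic rounding gives 76 or 77, averaging to 76.8 = 0.3 * 2**8
xs = np.full(10_000, 0.3)
print(encode_floor(0.3, 8))             # 76
print(encode_stochastic(xs, 8).mean())  # ~76.8
```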

mortendahl commented 2 years ago

@yanndupis @jvmncs @ekloberdanz maybe something for you to look into during the current focus on accuracy?

ekloberdanz commented 2 years ago

@mortendahl @yanndupis @jvmncs That is a good idea. Stochastic rounding has been leveraged for post-training quantization, where neural network weights and/or activations are quantized from f32 to int8 or lower to decrease memory usage and latency during inference. This paper proposes adaptive rounding: AdaRound. And here is a comprehensive pre-print on stochastic rounding, which is useful not only for fixed-point computations but also for low-bit-width float computations: Stochastic Rounding: Implementation, Error Analysis, and Applications. It was recommended to me by one of its co-authors, Nick Higham, one of the top experts in numerical analysis.
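As a rough sketch of that PTQ use case (the function name and scale value are hypothetical, not taken from the papers above), the same trick applies when mapping f32 weights to int8: rounding up with probability equal to the fractional remainder keeps the quantization error zero in expectation.

```python
import numpy as np

def quantize_int8_stochastic(w, scale, rng=None):
    # post-training quantization of f32 weights to int8 using
    # stochastic rounding instead of round-to-nearest: each value
    # rounds up with probability equal to its fractional remainder
    rng = np.random.default_rng() if rng is None else rng
    scaled = np.asarray(w, dtype=np.float64) / scale
    lo = np.floor(scaled)
    q = lo + (rng.random(lo.shape) < scaled - lo)
    return np.clip(q, -128, 127).astype(np.int8)

w = np.random.default_rng(0).normal(size=1000).astype(np.float32)
q = quantize_int8_stochastic(w, scale=0.05)
# the dequantized mean stays close to the original mean on average
print(w.mean(), (q.astype(np.float32) * 0.05).mean())
```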