tf-encrypted / moose

Secure distributed dataflow framework for encrypted machine learning and data processing
Apache License 2.0

Add ReLU activation function for neural networks #1029

Closed — ekloberdanz closed this 2 years ago

ekloberdanz commented 2 years ago

This is a placeholder ReLU implementation, written directly in the eDSL. In the next iteration, we will add a ZerosOp to replace the current workaround of computing zeros as `OnesOp - OnesOp`. Eventually, ReLU will be implemented directly in Moose by checking the sign bits of its inputs.
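As a rough illustration of the two approaches mentioned above (not Moose's actual operator API), here is a NumPy sketch: one version builds a zeros tensor as ones minus ones, mirroring the `OnesOp - OnesOp` workaround, and the other derives the result from the sign of the input, mirroring the planned sign-bit approach. The function names are hypothetical.

```python
import numpy as np

def relu_via_zeros(x):
    # Placeholder-style ReLU: construct zeros as ones - ones
    # (analogous to the zeros = OnesOp - OnesOp workaround),
    # then take an elementwise maximum against it.
    ones = np.ones_like(x)
    zeros = ones - ones
    return np.maximum(x, zeros)

def relu_via_sign(x):
    # Sign-based ReLU: build a 0/1 mask from the sign of x
    # and multiply, keeping only the non-negative entries.
    mask = (x > 0).astype(x.dtype)
    return x * mask
```

Both variants agree on ordinary float inputs; the sign-based form is closer in spirit to checking sign bits under an encrypted fixed-point encoding, where comparisons against zero are the expensive primitive.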