fancompute / neuroptica

Flexible simulation package for optical neural networks
https://doi.org/10.1109/JSTQE.2019.2930455
MIT License

added ReLU #1

Closed · twhughes closed this 5 years ago

twhughes commented 5 years ago

I added a (discontinuous) ReLU activation of the form

  f(x_i) = alpha * x_i   if |x_i| <  cutoff
  f(x_i) = x_i           if |x_i| >= cutoff

This activation is nice because it has the property f(z)/z = f'(z) (away from z = 0 and the discontinuity at |z| = cutoff), which means the backprop step can be done by simply propagating the error signal through the same system used for the forward-prop activation.
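
For concreteness, here is a minimal NumPy sketch of this activation and the corresponding backprop rule; the function names, `alpha`, and `cutoff` values are placeholders for illustration and are not taken from the neuroptica code.

```python
import numpy as np

# Hypothetical names/values, not the ones used in neuroptica.
ALPHA = 0.1    # assumed slope below the cutoff
CUTOFF = 1.0   # assumed cutoff magnitude

def piecewise_relu_forward(x, alpha=ALPHA, cutoff=CUTOFF):
    """f(x) = alpha * x for |x| < cutoff, f(x) = x for |x| >= cutoff."""
    return np.where(np.abs(x) < cutoff, alpha * x, x)

def piecewise_relu_backward(delta, x, alpha=ALPHA, cutoff=CUTOFF):
    """Backprop the error signal using f'(x) = f(x)/x,
    i.e. the same piecewise gain as the forward pass."""
    gain = np.where(np.abs(x) < cutoff, alpha, 1.0)
    return delta * gain

# Numerical check of f(z)/z == f'(z) away from z = 0 and |z| = cutoff.
z = np.array([-2.0, -0.5, 0.3, 1.7])
assert np.allclose(piecewise_relu_forward(z) / z,
                   np.where(np.abs(z) < CUTOFF, ALPHA, 1.0))
```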

I want to test whether this activation can be useful in real problems.

Added tests, which passed.