brainpy / BrainPy

Brain Dynamics Programming in Python
https://brainpy.readthedocs.io/
GNU General Public License v3.0

softplus is not correct #580

Closed Dr-Chen-Xiaoyu closed 8 months ago

Dr-Chen-Xiaoyu commented 8 months ago

Hi, brainpy team:

Based on the source code (https://brainpy.readthedocs.io/en/latest/_modules/brainpy/_src/math/activations.html#softplus):

def softplus(x, beta=1, threshold=20):
  x = x.value if isinstance(x, Array) else x
  # note: above the threshold this returns x * beta, i.e. a line with slope beta rather than 1
  return jnp.where(x > threshold, x * beta, 1 / beta * jnp.logaddexp(beta * x, 0))

The softplus activation in brainpy is not correct in the linear part above the threshold, whose slope should always be 1. The behavior can be reproduced with the following code:

import matplotlib.pyplot as plt
import brainpy as bp
import brainpy.math as bm

softplus = bp.dnn.Softplus(beta=0.1)  # try different beta
scale = 5e1
x = bm.linspace(-scale, scale, 20001)
plt.plot(x, softplus(x))
plt.show()

[Figure: plot of softplus(x) for beta=0.1 over x in [-50, 50], showing the incorrect linear part]
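To make the error concrete, here is a finite-difference slope check above the threshold (a minimal sketch reusing the setup above; the test points 30 and 40 are arbitrary values past the default threshold of 20):

import brainpy as bp
import brainpy.math as bm

beta = 0.1
softplus = bp.dnn.Softplus(beta=beta)  # default threshold = 20

# secant slope well above the threshold, where softplus should behave like x
x = bm.linspace(30.0, 40.0, 2)
y = softplus(x)
print((y[1] - y[0]) / (x[1] - x[0]))  # ~0.1 (= beta) with the current code; the true slope here is ~0.97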

Based on the equation of softplus (https://en.wikipedia.org/wiki/Rectifier_(neural_networks)) and the usage of threshold in PyTorch (https://pytorch.org/docs/stable/generated/torch.nn.Softplus.html#torch.nn.Softplus), it might be corrected like this:

def softplus(x, beta=1, threshold=20):
  x = x.value if isinstance(x, Array) else x
  return jnp.where(x > threshold / beta, x, 1 / beta * jnp.logaddexp(beta * x, 0))
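As a sanity check on this branch condition (a sketch; softplus_fixed is just a local name for the proposal above), the piecewise form should agree with the reference 1/beta * log(1 + exp(beta*x)), here computed via jax.nn.softplus, across the switch point:

import jax
import jax.numpy as jnp

def softplus_fixed(x, beta=1, threshold=20):
  # x > threshold / beta is the same as beta * x > threshold, where
  # 1/beta * log(1 + exp(beta*x)) already equals x to within exp(-threshold) / beta
  return jnp.where(x > threshold / beta, x, 1 / beta * jnp.logaddexp(beta * x, 0))

x = jnp.linspace(-300.0, 300.0, 601)
err = jnp.abs(softplus_fixed(x, beta=0.1) - jax.nn.softplus(0.1 * x) / 0.1)
print(err.max())  # only float32 rounding; no jump at the switch point x = 200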

Alternatively, it might just directly use JAX's own softplus, if brainpy can auto-differentiate through it:

def softplus(x, beta=1):
  return jax.nn.softplus(beta*x)/beta
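If the jax.nn-based variant is preferred, its gradient behaves as expected under JAX autodiff (a minimal sketch at the JAX level only; softplus_via_jax is a local name, and brainpy's own transforms would wrap the same JAX machinery):

import jax
import jax.numpy as jnp

def softplus_via_jax(x, beta=1.0):
  return jax.nn.softplus(beta * x) / beta

# d/dx [softplus(beta*x)/beta] = sigmoid(beta*x): the slope saturates at 1 for
# large x regardless of beta, which is the behavior the issue asks for
grad_fn = jax.vmap(jax.grad(softplus_via_jax), in_axes=(0, None))
print(grad_fn(jnp.array([-50.0, 0.0, 50.0]), 0.1))  # ~[0.0067, 0.5, 0.9933]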

Best, XiaoyuChen, SJTU

chaoming0625 commented 8 months ago

Thanks for the report~

chaoming0625 commented 8 months ago

Please see #581.

Moreover, I have increased the default threshold to 40.

Currently, the softplus function is:

[Figure_1: plot of the updated softplus function]
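One way to see that the larger threshold is numerically safe (an observation, not something stated in the thread): the gap between log(1 + exp(y)) and y is roughly exp(-y), which is already below float32 resolution at y = 20 but still visible in float64 there; at y = 40 it is negligible in both precisions, so the piecewise switch is exact:

import numpy as np

for y in (20.0, 40.0):
  # log(1 + exp(y)) = y + log1p(exp(-y)); the correction term is ~exp(-y)
  print(y, np.logaddexp(y, 0.0) == y)  # 20.0 -> False (in float64), 40.0 -> True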