junaidmalik09 / fastonn

FastONN - Python-based open-source GPU implementation for Operational Neural Networks
GNU General Public License v3.0

SelfONN1d precedence layer #11

Open mdzalfirdausi opened 1 year ago

mdzalfirdausi commented 1 year ago

It's written that: "A SelfONN2dLayer expects input to be bounded between [-1, 1]. To respect this, a Tanh or Sigmoid activation layer must precede it."

What if SelfONN1d is used instead? Does it have the same requirement as SelfONN2dLayer?
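
For context, here is a minimal sketch of the pattern the quoted documentation describes. The import path and constructor arguments are assumptions for illustration only; check the repo source for the exact SelfONN2dLayer signature.

```python
import torch
import torch.nn as nn

# Assumed import path and Conv2d-like signature (illustrative only;
# see the fastonn source for the actual API).
from fastonn import SelfONN2dLayer

model = nn.Sequential(
    nn.Tanh(),  # bounds activations to [-1, 1] before the Self-ONN layer
    SelfONN2dLayer(in_channels=3, out_channels=16, kernel_size=3, q=3),
)

y = model(torch.randn(1, 3, 32, 32))
```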

junaidmalik09 commented 1 year ago

Yes, SelfONN1d has the same requirement as SelfONN2dLayer. Please note that this is necessary especially when working with high q values, because the input is raised up to the qth power and, if not bounded between [-1, 1], can explode. In practice, for low q values and shallow networks, a ReLU activation layer can also be used.
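
To see why the bound matters, here is a small, self-contained PyTorch sketch of the blow-up. It only demonstrates the power terms a Self-ONN layer computes internally; it does not use the fastonn API.

```python
import torch

# A Self-ONN layer raises its input to powers 1..q, so magnitudes grow
# exponentially in q whenever |x| strays outside [-1, 1].
q = 7
x = torch.randn(4, 16, 128) * 5.0        # unbounded 1D activations

print((x ** q).abs().max())              # ~1e8 or more: numerically unstable
print((torch.tanh(x) ** q).abs().max())  # <= 1: safe for any q

# ReLU leaves the output unbounded above, so the same blow-up can occur;
# that is why ReLU is only workable for low q values and shallow networks.
print((torch.relu(x) ** q).abs().max())
```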