Closed ZeHuangFang closed 4 years ago
Nothing much happens. You can of course use it directly. However, Keras custom layers like LeakyReLU exist for activation functions that carry arguments — here that argument is w0 — so I implemented Sine as a layer as well.
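The pattern described above can be sketched as follows. This is a hypothetical minimal `Sine` layer (not the repository's actual implementation) showing why wrapping the activation as a layer is convenient: the frequency `w0` is stored on the layer, the same way LeakyReLU stores its `alpha`, rather than being hard-coded into a plain function.

```python
import numpy as np
import tensorflow as tf

class Sine(tf.keras.layers.Layer):
    """Hypothetical sketch of a sine activation wrapped as a Keras layer.

    Storing w0 on the layer lets it be configured per-layer and serialized
    with the model, like LeakyReLU does with its alpha argument.
    """

    def __init__(self, w0=1.0, **kwargs):
        super().__init__(**kwargs)
        self.w0 = w0

    def call(self, inputs):
        return tf.sin(self.w0 * inputs)

    def get_config(self):
        # Makes w0 survive model save/load.
        config = super().get_config()
        config.update({"w0": self.w0})
        return config


x = tf.constant([[0.5]])

# Used as a layer: w0 is a configurable attribute.
layer_out = Sine(w0=30.0)(x)

# Used as a plain function: w0 must be fixed or closed over by hand.
def sine_fn(t, w0=30.0):
    return tf.sin(w0 * t)

fn_out = sine_fn(x)
print(np.allclose(layer_out.numpy(), fn_out.numpy()))  # both compute sin(30 * x)
```

Both forms compute the same values; the layer form simply integrates with Keras model configuration and serialization.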
Yes, it works fine as a normal function! I plan to extend it to RNN networks for better signal processing. Thanks again!
Thank you for your contribution!
I saw that in the SinusodialRepresentationDense layer the Sine activation function is passed as a network layer. Is there any benefit to doing this?
And what happens if I change it to a normal function?
Thank you very much!