CodingTrain / Toy-Neural-Network-JS

Neural Network JavaScript library for Coding Train tutorials
MIT License

Activation functions as static context instead of constant #78

Open xxMrPHDxx opened 6 years ago

xxMrPHDxx commented 6 years ago

We are currently declaring the activation functions as constants. I think it would be much better to declare them in a static context instead.

I've included the code in my gist as a reference.
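To illustrate the idea, here is a minimal sketch (not the library's actual code) of what declaring the activations as static properties of the `ActivationFunction` class could look like, assuming a `func`/`dfunc` pair like the one the library uses:

```javascript
// Sketch: activation functions attached to the class as static
// properties, rather than declared as module-level constants.
class ActivationFunction {
  constructor(func, dfunc) {
    this.func = func;   // the activation itself
    this.dfunc = dfunc; // its derivative, expressed in terms of the output y
  }
}

// Static context instead of `const sigmoid = ...`:
ActivationFunction.sigmoid = new ActivationFunction(
  x => 1 / (1 + Math.exp(-x)),
  y => y * (1 - y)
);

ActivationFunction.tanh = new ActivationFunction(
  x => Math.tanh(x),
  y => 1 - y * y
);

console.log(ActivationFunction.sigmoid.func(0)); // 0.5
```

One advantage of this layout is that all activations live in one namespace (`ActivationFunction.sigmoid`, `ActivationFunction.tanh`, ...), so adding a new one never pollutes the module scope.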

Adil-Iqbal commented 6 years ago

I agree with the idea that the activation functions should be moved into their own file. There are a ton of activation functions that have yet to be implemented because of the current limitations of the ActivationFunction class. The class itself will probably go through several changes in the future, especially if we implement activation functions with alpha and lambda values, not to mention activations that take more than a single input x, like softmax and maxout.
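As a hypothetical sketch of the two cases mentioned above (names and signatures are illustrative, not the library's API): a parameterized activation like ELU carries an extra alpha value, and softmax operates on a whole vector rather than a single x.

```javascript
// Parameterized activation: ELU is a family of functions indexed by alpha.
// A factory returning a func/dfunc pair is one way to fit it into the
// existing shape of the class.
const elu = alpha => ({
  func: x => (x >= 0 ? x : alpha * (Math.exp(x) - 1)),
  dfunc: x => (x >= 0 ? 1 : alpha * Math.exp(x))
});

// Vector activation: softmax needs the entire layer's outputs at once,
// which the current single-x interface cannot express.
function softmax(xs) {
  const max = Math.max(...xs); // subtract the max for numerical stability
  const exps = xs.map(x => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum); // entries are positive and sum to 1
}

console.log(elu(1.0).func(-1));  // Math.exp(-1) - 1, about -0.632
console.log(softmax([1, 2, 3])); // a probability distribution over 3 entries
```

Supporting both shapes is probably what forces the class redesign: a constructor taking a single `(func, dfunc)` pair cannot describe an activation whose derivative is a Jacobian over the whole layer, as softmax's is.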