Description
Added ReLU and tanh activation functions.
Fixes #28
Type of change
[•] New feature (non-breaking change which adds functionality)
[•] This change requires a documentation update
How Has This Been Tested?
I have run unit tests for both activation functions against several functions of the touvlo/supv/nn_clsf.py file and added these tests to tests/supv/test_nn_clsf.py.
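For reference, a minimal sketch of the shape of such tests, assuming the new functions are exposed as `relu` and `tanh` from `touvlo.supv.nn_clsf` (the names and import path are assumptions for illustration, not the module's confirmed API):

```python
import numpy as np

# Hypothetical import: the actual function names/locations in
# touvlo/supv/nn_clsf.py may differ.
from touvlo.supv.nn_clsf import relu, tanh


def test_relu_clamps_negative_inputs_to_zero():
    z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    expected = np.array([0.0, 0.0, 0.0, 0.5, 2.0])
    assert np.allclose(relu(z), expected)


def test_tanh_matches_numpy_reference():
    z = np.array([-1.0, 0.0, 1.0])
    # tanh should agree with NumPy's reference implementation
    assert np.allclose(tanh(z), np.tanh(z))
```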
Checklist:
[•] Implementation
Implement tanh and ReLU activation functions (a sketch of their typical shape follows this checklist)
Edit the Classification Neural Network so that the activation function used by its hidden layers can be chosen
[•] Testing
Update the Classification Neural Network's test suite to match the changes to the module
Add test cases for new functions and module routines that employ them
[•] Documentation
Update documentation of edited functions
Add documentation for new functions
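For readers of this PR, a minimal sketch of what the two activations and the derivatives backpropagation needs typically look like; these are illustrative NumPy definitions, not the exact functions added here:

```python
import numpy as np

def relu(z):
    """Rectified linear unit: element-wise max(0, z)."""
    return np.maximum(0, z)

def relu_grad(z):
    """ReLU derivative used by backpropagation: 1 where z > 0, else 0."""
    return (z > 0).astype(z.dtype)

def tanh(z):
    """Hyperbolic tangent, squashing inputs into (-1, 1)."""
    return np.tanh(z)

def tanh_grad(z):
    """tanh derivative: 1 - tanh(z)^2."""
    return 1.0 - np.tanh(z) ** 2
```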
TODO:
Separate activation functions for each hidden layer (currently all hidden layers use the same given activation function); a possible shape is sketched below.
Store the activation function in use so it does not have to be passed as an argument on every successive function call.
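A possible shape for both TODO items, sketched as a hypothetical class-based API (the names `ACTIVATIONS`, `NNClassifier`, and `hidden_activations` are assumptions for illustration, not touvlo's actual module layout):

```python
import numpy as np

# Map names to (activation, derivative) pairs so each hidden layer can
# resolve its own activation once.
ACTIVATIONS = {
    "relu": (lambda z: np.maximum(0, z), lambda z: (z > 0).astype(float)),
    "tanh": (np.tanh, lambda z: 1.0 - np.tanh(z) ** 2),
}


class NNClassifier:
    def __init__(self, layer_sizes, hidden_activations):
        # One activation name per hidden layer, e.g. ["relu", "tanh"]
        # for layer_sizes like [400, 25, 25, 10].
        if len(hidden_activations) != len(layer_sizes) - 2:
            raise ValueError("expected one activation per hidden layer")
        # Storing the resolved pairs also covers the second TODO item:
        # forward and backward passes read them from the instance instead
        # of receiving the activation as an argument on every call.
        self.hidden_activations = [ACTIVATIONS[name]
                                   for name in hidden_activations]
```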