Benardi / touvlo

:robot: ML algorithms implemented from scratch and provided block by block
https://touvlo.readthedocs.io/en/latest/
MIT License

Add tanh, relu activation functions 🚧 #30

Closed ghost closed 4 years ago

ghost commented 5 years ago

Description

Added ReLU and tanh activation functions. Fixes #28
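A minimal sketch of the two activations and their derivatives as they would appear in a from-scratch implementation. Function names and signatures here are illustrative, not necessarily touvlo's actual API:

```python
import numpy as np

def relu(z):
    """Rectified linear unit: max(0, z), element-wise."""
    return np.maximum(0, z)

def relu_gradient(z):
    """Derivative of ReLU: 1 where z > 0, else 0."""
    return (z > 0).astype(float)

def tanh(z):
    """Hyperbolic tangent activation, mapping inputs to (-1, 1)."""
    return np.tanh(z)

def tanh_gradient(z):
    """Derivative of tanh: 1 - tanh(z)^2."""
    return 1 - np.tanh(z) ** 2
```

The gradients are what backpropagation in the classification network would consume, so they are typically implemented alongside the activations themselves.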

Type of change

How Has This Been Tested?

I ran unit tests for both activation functions against several functions in touvlo/supv/nn_clsf.py and added these tests to tests/supv/test_nn_clsf.py.

Checklist:

  • [x] Implementation
    • Implement tanh and ReLU activation functions
    • Edit Classification Neural Network to allow choosing an activation function for each layer
  • [x] Testing
    • Edit the Classification Neural Network's test suite to conform to the module updates
    • Add test cases for the new functions and the module routines that employ them
  • [x] Documentation
    • Update documentation of edited functions
    • Add documentation for new functions

TODO:

  • Support a separate activation function for each hidden layer (currently all hidden layers use the same given activation function).
  • Store the activation function used so it need not be passed as an argument on every successive function call.
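Both TODO items could be addressed by storing one activation per hidden layer at construction time. A hedged sketch, assuming a simple feed-forward structure with a sigmoid output layer (class and function names are hypothetical, not part of touvlo):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def tanh(z):
    return np.tanh(z)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

class FeedForward:
    """Hypothetical network that stores per-layer activations,
    so they are not passed as arguments on every forward call."""

    def __init__(self, activations):
        # One activation function per hidden layer,
        # e.g. [relu, tanh] for a two-hidden-layer network.
        self.activations = activations

    def forward(self, a, weights):
        # weights: one matrix per layer; hidden layers use the stored
        # activations, the output layer uses sigmoid for classification.
        for W, g in zip(weights[:-1], self.activations):
            a = g(a @ W)
        return sigmoid(a @ weights[-1])
```

Keeping the activations on the instance means successive calls (forward pass, gradient checks, prediction) all agree on which function each layer uses without repeating the argument.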