RoboticsClubIITJ / ML-DL-implementation

An implementation of ML and DL algorithms from scratch in Python using nothing but NumPy and Matplotlib.
BSD 3-Clause "New" or "Revised" License

Other variants of ReLU can be added in the script activation.py #69

Closed parva-jain closed 3 years ago

parva-jain commented 3 years ago

I'm willing to add other variants of the ReLU activation function to support a wider range of applications of this package.

rohansingh9001 commented 3 years ago

@parva-jain Sure, you can work on it. Before that, can you please give a brief description of what you are going to add?

parva-jain commented 3 years ago

I have heard of Parametric ReLU (PReLU) and Scaled ELU (SELU), but so far I have only been able to find their Keras implementations. More activation functions, such as the binary step function and the Swish function, could also be added.
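
For reference, a minimal NumPy-only sketch of the four functions mentioned above could look like the following. The function names, signatures, and default parameter values here are illustrative assumptions, not the actual API of the repo's activation.py:

```python
import numpy as np


def binary_step(x):
    # 1 where x >= 0, else 0 (element-wise on a NumPy array)
    return np.where(x >= 0, 1.0, 0.0)


def prelu(x, alpha=0.25):
    # Parametric ReLU: x for x > 0, alpha * x otherwise.
    # alpha is a learnable parameter in the original paper;
    # 0.25 here is just an illustrative default.
    return np.where(x > 0, x, alpha * x)


def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the fixed constants from Klambauer et al. (2017)
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1))


def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 gives the SiLU variant
    return x / (1 + np.exp(-beta * x))
```

Each function applies element-wise to a NumPy array, so the outputs could be checked against the corresponding Keras activations on sample inputs before opening a PR.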

rohansingh9001 commented 3 years ago

@parva-jain Sure, go ahead, you can work on them. Make sure to follow PEP 8 formatting and ensure the correctness of your code.

kwanit1142 commented 3 years ago

LGTM too.