RoboticsClubIITJ / ML-DL-implementation

An implementation of ML and DL algorithms from scratch in python using nothing but NumPy and Matplotlib.
BSD 3-Clause "New" or "Revised" License
49 stars 69 forks

Convert Activations from simple functions to classes and add gradient method. #60

Closed rohansingh9001 closed 3 years ago

rohansingh9001 commented 3 years ago

Currently, all the activations in activations.py are simple functions.

However, for the future implementation of Neural Networks, we will also need derivative methods for each of these functions. You can have a look at loss_func.py for reference. There, each class represents a loss function and has both loss and derivative methods. You have to implement something similar for the activations.
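The requested refactor might look something like the sketch below for the sigmoid activation, pairing the forward computation with its analytical derivative in one class. The class and method names (`Sigmoid`, `activation`, `derivative`) are illustrative assumptions modelled on the loss/derivative pattern described for loss_func.py, not necessarily the names used in the repository.

```python
import numpy as np


class Sigmoid:
    """Illustrative sketch: sigmoid activation as a class, mirroring
    the loss_func.py pattern of pairing a forward method with its
    derivative. Names are assumptions, not the repo's actual API."""

    @staticmethod
    def activation(X):
        # sigma(x) = 1 / (1 + e^(-x)), applied elementwise
        return 1 / (1 + np.exp(-X))

    @staticmethod
    def derivative(X):
        # sigma'(x) = sigma(x) * (1 - sigma(x))
        s = Sigmoid.activation(X)
        return s * (1 - s)
```

With this shape, a backpropagation step can call `derivative` on the pre-activation values without recomputing anything by hand, e.g. `Sigmoid.derivative(np.array([0.0]))` gives `0.25`.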

rohansingh9001 commented 3 years ago

Converting all activations is not a requirement; you may implement only one or as many as you prefer. Each implementation should be in a different PR. If you make a single PR implementing more than one function, I will only give you points for one easy contribution. So make sure you capitalise on this 😉.

Abjcodes commented 3 years ago

Hey @rohansingh9001, I have raised a PR after converting the sigmoid function. Please review it and notify me if there are any changes to be made so that I can convert the rest accordingly.

Abjcodes commented 3 years ago

Hey @rohansingh9001, @agrawalshubham01, could you reopen this issue since it's not completed, or should I open another issue for this?

agrawalshubham01 commented 3 years ago

@Abjcodes now you can contribute here.

Abjcodes commented 3 years ago

Hey @rohansingh9001, @agrawalshubham01, I noticed there are still some activations that are yet to be converted. Should I raise multiple PRs or only one?

kwanit1142 commented 3 years ago

@Abjcodes raise a single PR, as it will be easier for us to take a close look at multiple models simultaneously.

Abjcodes commented 3 years ago

Hey @kwanit1142 @rohansingh9001 @agrawalshubham01, please check the PR. Also, I need some clarification regarding the gradient of the Heaviside function. Is it okay to include DiracDelta from the sympy module? I will fix the failing checks along with any other requested changes.
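For context on the Heaviside question: its exact derivative is the Dirac delta, which is zero for every x ≠ 0 and unbounded at x = 0, so it is not directly usable as a numerical gradient, and `sympy.DiracDelta` is symbolic and would not mix with NumPy arrays in a training loop. A common numerical convention is to return zeros. The sketch below assumes the same illustrative class/method names as above; it is one possible approach, not the thread's agreed resolution.

```python
import numpy as np


class Heaviside:
    """Illustrative sketch: Heaviside step activation with a
    numerically usable derivative. Names are assumptions.

    The true derivative is the Dirac delta (zero almost everywhere,
    undefined at 0), so returning an all-zeros array is a common
    pragmatic convention for gradient-based code."""

    @staticmethod
    def activation(X):
        # 1 where x >= 0, else 0 (second argument sets the value at x == 0)
        return np.heaviside(X, 1)

    @staticmethod
    def derivative(X):
        # Dirac delta is 0 almost everywhere; use zeros numerically
        return np.zeros_like(X, dtype=float)
```

A consequence of the zero gradient is that no error signal flows through this activation during backpropagation, which is why smooth surrogates such as sigmoid are usually preferred for training.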

Abjcodes commented 3 years ago

Hey @rohansingh9001 @agrawalshubham01, Please check the PR.