Closed Sohammhatre10 closed 1 month ago
I didn't understand the request to add the activation functions to model.py in https://github.com/ravin-d-27/PyDeepFlow/issues/15, since they are already being called: while running runner.py, the user has the option to enter the activations as ['relu', 'relu'] and change them to any of the available ones. The user can get them from pydeepflow.activations(required_activation). Just tell me if explicit input is required from the user.
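To make the pattern described above concrete, here is a minimal sketch of name-based activation lookup. The registry and `get_activation` helper are illustrative assumptions, not PyDeepFlow's actual API, which may organize the dispatch differently:

```python
import numpy as np

# Hypothetical registry mapping user-supplied names to callables;
# the real pydeepflow.activations lookup may differ.
ACTIVATIONS = {
    "relu": lambda x: np.maximum(0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh,
}

def get_activation(name):
    """Return the activation callable for a user-supplied name."""
    try:
        return ACTIVATIONS[name]
    except KeyError:
        raise ValueError(f"Unknown activation: {name!r}")

# The user-facing pattern from runner.py: one name per hidden layer.
layer_activations = [get_activation(n) for n in ["relu", "relu"]]
```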
Hi @Sohammhatre10! Thank you for the PR! I will try running runner.py by changing ['relu', 'relu'] into the newly introduced activation functions, and let you know!
Hi @Sohammhatre10
I tried to run runner.py using the new activation functions and faced this issue: there is no abs() function in device.py.
Your work is great! But can you please add the abs function to device.py and make a PR?
Sure, I will add the abs method to device.py.
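For reference, a fix along these lines could be a thin wrapper method that forwards to the active array backend. This is a minimal sketch assuming device.py dispatches between NumPy (CPU) and CuPy (GPU), which is a common pattern; the actual PyDeepFlow class may be structured differently:

```python
import numpy as np

# Sketch of a Device wrapper with the missing abs() method added.
# The use_gpu/CuPy dispatch here is an assumption about device.py's design.
class Device:
    def __init__(self, use_gpu=False):
        if use_gpu:
            import cupy as cp  # requires a CUDA-capable environment
            self.xp = cp
        else:
            self.xp = np

    def abs(self, x):
        """Element-wise absolute value on the active backend."""
        return self.xp.abs(x)

device = Device()
print(device.abs(np.array([-3.0, 4.0])))  # [3. 4.]
```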
Summary: This PR adds several new activation functions and their derivatives to the activation and activation_derivative functions, updates the documentation, and enhances the unit tests, in the context of https://github.com/ravin-d-27/PyDeepFlow/issues/15.
Key Changes: New activation functions have been added to the activation and activation_derivative functions. Additionally, the existing Leaky ReLU function was modified to accept an alpha parameter, and some functions like Hardtanh and Hardsigmoid, which were not explicitly mentioned in the original code, are now included in the list of supported functions.
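The changes above can be sketched as follows. The formulas follow the common (PyTorch-style) definitions of Leaky ReLU, Hardtanh, and Hardsigmoid; the exact signatures in PyDeepFlow's activations module may differ:

```python
import numpy as np

# Leaky ReLU with a configurable alpha, as described in the PR.
def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def leaky_relu_derivative(x, alpha=0.01):
    return np.where(x > 0, 1.0, alpha)

# Hardtanh: identity clipped to [min_val, max_val].
def hardtanh(x, min_val=-1.0, max_val=1.0):
    return np.clip(x, min_val, max_val)

def hardtanh_derivative(x, min_val=-1.0, max_val=1.0):
    return np.where((x > min_val) & (x < max_val), 1.0, 0.0)

# Hardsigmoid (PyTorch-style): clip(x/6 + 0.5, 0, 1).
def hardsigmoid(x):
    return np.clip(x / 6.0 + 0.5, 0.0, 1.0)

def hardsigmoid_derivative(x):
    return np.where((x > -3.0) & (x < 3.0), 1.0 / 6.0, 0.0)
```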
Testing: All changes validated with unittest, including edge cases.
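In the spirit of the unittest validation mentioned above, edge cases could be exercised like this. The test class and the inline `leaky_relu` are hypothetical stand-ins, not the PR's actual test module:

```python
import unittest
import numpy as np

# Stand-in for the library function under test (assumed signature).
def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

class TestActivations(unittest.TestCase):
    def test_leaky_relu_negative(self):
        # Custom alpha scales negative inputs.
        self.assertAlmostEqual(float(leaky_relu(-1.0, alpha=0.1)), -0.1)

    def test_leaky_relu_zero(self):
        # Edge case: zero falls in the non-positive branch and stays 0.
        self.assertEqual(float(leaky_relu(0.0)), 0.0)

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```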