helblazer811 closed this issue 1 year ago
A potential way to define activation functions could be as follows:

```python
import random
from abc import ABC, abstractmethod


class ActivationFunction(ABC):
    def __init__(self, function_name=None, x_range=(-1, 1), y_range=(-1, 1)):
        self.function_name = function_name
        self.x_range = x_range
        self.y_range = y_range

    @abstractmethod
    def apply_function(self, x_val):
        """Evaluates the function at the given x_val."""

    def make_evaluate_animation(self, x_val=None):
        """Evaluates the function at x_val, or at a random point in x_range."""
        if x_val is None:
            x_val = random.uniform(self.x_range[0], self.x_range[1])
        # Make an axis
        # Surround the axis with a rounded rectangle
        # Plot the function on the axis over the given x_range
        # Evaluate the function at x_val and show a highlighted dot


class ReLUFunction(ActivationFunction):
    def __init__(self):
        super().__init__(function_name="ReLU")

    def apply_function(self, x_val):
        if x_val < 0:
            return 0
        return x_val
```
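To illustrate how custom activations would plug into this, here is a hedged sketch of a second subclass. The `SigmoidFunction` class and the point-sampling snippet are hypothetical, not part of the existing code; the base class is restated minimally so the example runs standalone.

```python
import math
from abc import ABC, abstractmethod


# Minimal restatement of the proposed base class so this snippet is self-contained.
class ActivationFunction(ABC):
    def __init__(self, function_name=None, x_range=(-1, 1), y_range=(-1, 1)):
        self.function_name = function_name
        self.x_range = x_range
        self.y_range = y_range

    @abstractmethod
    def apply_function(self, x_val):
        """Evaluates the function at the given x_val."""


class SigmoidFunction(ActivationFunction):
    """Hypothetical example subclass: logistic sigmoid on [-5, 5]."""

    def __init__(self):
        super().__init__(function_name="Sigmoid", x_range=(-5, 5), y_range=(0, 1))

    def apply_function(self, x_val):
        return 1.0 / (1.0 + math.exp(-x_val))


# Sampling the function across x_range, as a plotting routine might do.
sigmoid = SigmoidFunction()
x0, x1 = sigmoid.x_range
xs = [x0 + i * (x1 - x0) / 50 for i in range(51)]
ys = [sigmoid.apply_function(x) for x in xs]
```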
I made a working beta version of this for convolutional layers.
This is done as of 301b230c73eb9515f81aaa7fac1f67a574b2d7d0
I want to visualize activation functions. I can envision two different ways of doing this.
I am in favor of showing the activation function above existing layers because it will not interfere with existing visualizations like FeedForwardToFeedForward and Convolutional3dToConvolutional3D.
A possible way to allow for this would be to add an optional string parameter to layers like FeedForward and Convolutional3d.
Another way of doing it could be to pass a function or callable object (the callable object could also have a name) instead of a string, which would allow for custom activation functions.
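One way a layer could normalize such a string-or-callable parameter might look like the following sketch. The helper name `resolve_activation` and the `ACTIVATIONS` registry are hypothetical, not existing API:

```python
import math
from typing import Callable, Optional, Union

# Hypothetical registry mapping names to functions; a named callable object
# could carry its own display name instead.
ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
}


def resolve_activation(activation: Union[str, Callable, None]) -> Optional[Callable]:
    """Accept either a registered name or any callable (custom activations)."""
    if activation is None:
        return None
    if isinstance(activation, str):
        return ACTIVATIONS[activation.lower()]
    if callable(activation):
        return activation
    raise TypeError(f"Unsupported activation spec: {activation!r}")
```

A layer constructor could then call `resolve_activation` once and store both the function and a label for the small plot.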
A small coordinate frame with a function visualization can be shown above the layer that is being "activated" and that small function can be highlighted whenever there is a forward pass.