helblazer811 / ManimML

ManimML is a project focused on providing animations and visualizations of common machine learning concepts with the Manim Community Library.
MIT License

Activation Functions #13

Closed helblazer811 closed 1 year ago

helblazer811 commented 1 year ago

I want to visualize activation functions. I can envision two different ways of doing this.

  1. Show activation functions as their own layer.
  2. Show activation functions above existing layers with a certain symbol.

I am in favor of showing the activation function above existing layers, because it will not interfere with existing visualizations like FeedForwardToFeedForward and Convolutional3dToConvolutional3D.

A possible way to allow for this would be to add an optional string parameter to layers like FeedForward and Convolutional3d.

nn = NeuralNetwork([
      ImageLayer(numpy_image, height=1.5),
      Convolutional3DLayer(1, 7, 7, 3, 3, filter_spacing=0.32, activation="relu"),
      Convolutional3DLayer(3, 5, 5, 3, 3, filter_spacing=0.32, activation="relu"),
      FeedForwardLayer(3, activation="sigmoid"),
    ],
    layer_spacing=0.25,
)

Another way of doing it could be to pass a function or callable object (the callable object could also carry a name) instead of a string, which would allow for custom activation functions.


def step_function(input):
    # Heaviside step: 0 for negative inputs, 1 otherwise
    return 0 if input < 0 else 1

nn = NeuralNetwork([
      ImageLayer(numpy_image, height=1.5),
      Convolutional3DLayer(1, 7, 7, 3, 3, filter_spacing=0.32, activation=step_function),
      Convolutional3DLayer(3, 5, 5, 3, 3, filter_spacing=0.32, activation="relu"),
      FeedForwardLayer(3, activation="sigmoid"),
    ],
    layer_spacing=0.25,
)

A small coordinate frame with a plot of the function can be shown above the layer that is being "activated", and that plot can be highlighted whenever there is a forward pass.
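A minimal sketch of how a layer could accept either form of the `activation` parameter, resolving a string name to a known function while passing callables through unchanged. The helper and registry names here are illustrative, not the actual ManimML API:

```python
import math

# Hypothetical registry of built-in activation names (illustrative only).
ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
}


def resolve_activation(activation):
    """Return a callable for a string name, or accept a callable as-is."""
    if callable(activation):
        return activation
    try:
        return ACTIVATIONS[activation]
    except KeyError:
        raise ValueError(f"Unknown activation: {activation!r}")
```

With this, `activation="relu"` and `activation=step_function` would go through the same code path inside the layer.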

helblazer811 commented 1 year ago

A potential way to define activation functions could be as follows:


from abc import ABC, abstractmethod
import random


class ActivationFunction(ABC):

    def __init__(self, function_name=None, x_range=(-1, 1), y_range=(-1, 1)):
        self.function_name = function_name
        self.x_range = x_range
        self.y_range = y_range

    @abstractmethod
    def apply_function(self, x_val):
        """Evaluates the function at the given x_val"""

    def make_evaluate_animation(self, x_val=None):
        """Evaluates the function at x_val, or at a random point in x_range"""
        if x_val is None:
            x_val = random.uniform(self.x_range[0], self.x_range[1])
        # Make an axis
        # Surround the axis with a rounded rectangle
        # Plot the function on the axis over the given range
        # Evaluate the function at x_val and show a highlighted dot


class ReLUFunction(ActivationFunction):

    def __init__(self):
        super().__init__(function_name="ReLU")

    def apply_function(self, x_val):
        if x_val < 0:
            return 0
        return x_val

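
Adding a new activation would then just mean subclassing the base class. A self-contained sketch under those assumptions, using a hypothetical `SigmoidFunction` as the example (the class names mirror the proposal above, not a finalized API):

```python
import math
from abc import ABC, abstractmethod


# Minimal stand-in for the proposed base class (illustrative only).
class ActivationFunction(ABC):

    def __init__(self, function_name=None, x_range=(-1, 1), y_range=(-1, 1)):
        self.function_name = function_name
        self.x_range = x_range
        self.y_range = y_range

    @abstractmethod
    def apply_function(self, x_val):
        """Evaluates the function at the given x_val"""


class SigmoidFunction(ActivationFunction):

    def __init__(self):
        # Sigmoid saturates outside roughly [-5, 5], and its outputs lie in (0, 1)
        super().__init__(function_name="Sigmoid", x_range=(-5, 5), y_range=(0, 1))

    def apply_function(self, x_val):
        return 1.0 / (1.0 + math.exp(-x_val))
```

The per-subclass `x_range`/`y_range` lets each activation pick a sensible plotting window for its small coordinate frame.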
helblazer811 commented 1 year ago

I made a working beta version of this for convolutional layers.

https://user-images.githubusercontent.com/14181830/214576978-9f76c386-f330-4e7b-ae2e-d4d0671411ad.mp4

helblazer811 commented 1 year ago

This is done as of 301b230c73eb9515f81aaa7fac1f67a574b2d7d0