berkeleybop / artificial-intelligence-ontology

An ontology modeling classes and relationships describing deep learning networks, their component layers and activation functions, machine learning methods, as well as potential AI/ML biases.
https://berkeleybop.github.io/artificial-intelligence-ontology/

The Function hierarchy is potentially overly flat #116

Open realmarcin opened 3 months ago

realmarcin commented 3 months ago

Using Claude Opus to generate initial suggestions for parent classes for the Function entities.

In the context of an AI ontology, these deep neural network (DNN) activation functions can be grouped into several categories based on their properties and behaviors. Here's a possible categorization:

Sigmoid-like Functions:

- Sigmoid Function
- Hard Sigmoid Function
- Softmax Function (a generalization of the sigmoid function for multi-class classification)

Rectified Functions:

- ReLU (Rectified Linear Unit) Function
- ELU (Exponential Linear Unit) Function
- SELU (Scaled Exponential Linear Unit) Function

Smooth Approximations to Rectified Functions:

- Softplus Function
- Swish Function
- GELU (Gaussian Error Linear Unit) Function

Hyperbolic Functions:

- Tanh (Hyperbolic Tangent) Function

Linear Functions:

- Linear Function

Other Functions:

- Softsign Function
- Exponential Function
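
For concreteness, here is a minimal sketch in plain Python (not tied to the AIO build tooling, which is not assumed here) that encodes the suggested grouping as a two-level mapping under the `Function` root. Leaf labels follow the list above; the parent-class names are hypothetical until the grouping is adopted.

```python
# Minimal sketch of the proposed (not yet adopted) parent classes for the
# activation-function leaves listed above. Plain Python only; no ontology
# library is assumed.

PROPOSED_FUNCTION_HIERARCHY: dict[str, list[str]] = {
    "Sigmoid-like Function": [
        "Sigmoid Function",
        "Hard Sigmoid Function",
        "Softmax Function",
    ],
    "Rectified Function": [
        "ReLU Function",
        "ELU Function",
        "SELU Function",
    ],
    "Smooth Approximation to Rectified Function": [
        "Softplus Function",
        "Swish Function",
        "GELU Function",
    ],
    "Hyperbolic Function": [
        "Tanh Function",
    ],
    "Linear Function": [
        # Per the list above, the lone member shares its name with the group.
        "Linear Function",
    ],
    "Other Function": [
        "Softsign Function",
        "Exponential Function",
    ],
}


def print_hierarchy(root: str = "Function") -> None:
    """Print the proposed two-level hierarchy rooted at `root`."""
    print(root)
    for parent, children in PROPOSED_FUNCTION_HIERARCHY.items():
        print(f"  {parent}")
        for child in children:
            print(f"    {child}")


if __name__ == "__main__":
    print_hierarchy()
```

A flat mapping like this is easy to review in the issue thread and could later be translated into whatever form the ontology build actually uses (e.g. new intermediate classes with the existing leaves re-parented under them).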