All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 94.99%. Comparing base (cb45f4a) to head (62ac483). Report is 1 commit behind head on master.
Summary of change
This pull request introduces an implementation of the SoftMax algorithm. The SoftMax function is often used as the last activation function of a neural network, normalizing the network's output into a probability distribution over the predicted output classes.
Definition
The SoftMax function takes as input a vector z of K real numbers, and normalizes it into a probability distribution consisting of K probabilities proportional to the exponentials of the input numbers.
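For reference, the standard form of this definition (consistent with the description above) can be written as:

```math
\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \qquad i = 1, \ldots, K
```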
Motivation
The SoftMax function is a critical component in many machine learning models, particularly in classification tasks within neural networks.
Time Complexity
The time complexity of the SoftMax algorithm is O(n), where n is the number of elements in the input array.
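The PR's actual code is not reproduced here; as an illustration of the linear-time computation described above, a minimal NumPy sketch could look like the following (the `softmax` name and the max-subtraction for numerical stability are assumptions for this example, not details taken from the PR):

```python
import numpy as np


def softmax(z: np.ndarray) -> np.ndarray:
    """Normalize a vector of K real numbers into a probability distribution."""
    # Subtracting the maximum before exponentiating is a common trick to
    # avoid overflow; it does not change the result.
    shifted = z - np.max(z)       # one pass over the n elements
    exps = np.exp(shifted)        # one pass
    return exps / np.sum(exps)    # one pass -> overall O(n)


if __name__ == "__main__":
    print(softmax(np.array([1.0, 2.0, 3.0])))  # approximately [0.09, 0.24, 0.67]
```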