Here is the softmax function I wrote not so long ago:
```javascript
/**
 * Softmax function
 * @method softmax
 * @param z An array of numbers (vector)
 * @return An array of numbers (vector)
 **/
function softmax(z) {
  // Subtracting the max keeps Math.exp from overflowing for large
  // inputs; it does not change the result.
  const max = Math.max(...z);
  let denom = 0;
  for (let j = 0; j < z.length; j++) {
    denom += Math.exp(z[j] - max);
  }
  const ans = [];
  for (let i = 0; i < z.length; i++) {
    ans.push(Math.exp(z[i] - max) / denom);
  }
  return ans;
}
```
This function is not implemented in the repository yet.
For this function to work in a Neural Network, we would need to write its derivative. This might be difficult, since the function takes in and outputs vectors (represented as arrays), so the derivative is a Jacobian matrix of partial derivatives rather than a single number.
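Concretely, the (i, j) entry of that Jacobian is s[i] * (δij − s[j]), where s = softmax(z). A minimal sketch of what that could look like; the name `softmaxJacobian` and the array-of-arrays matrix representation are my assumptions, not existing repository code (softmax is restated so the snippet runs on its own):

```javascript
function softmax(z) {
  const denom = z.reduce((sum, v) => sum + Math.exp(v), 0);
  return z.map((v) => Math.exp(v) / denom);
}

// Hypothetical derivative: returns the full Jacobian matrix J,
// where J[i][j] = d softmax(z)[i] / d z[j] = s[i] * (deltaij - s[j]).
function softmaxJacobian(z) {
  const s = softmax(z);
  return s.map((si, i) =>
    s.map((sj, j) => si * ((i === j ? 1 : 0) - sj))
  );
}
```

A useful sanity check: each row of the Jacobian sums to 0, because the softmax outputs always sum to 1 no matter how the inputs change.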
These two functions would need to be implemented in src/core/functions/actfuncs.js.
For this function to work with a Dann model, we would need to change how activations are handled, since softmax expects a whole vector instead of a single number value. I could work on that once the derivative is implemented.
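One thing worth noting for that integration: when softmax is paired with a cross-entropy loss (the usual combination for classification), the gradient with respect to the pre-softmax inputs collapses to `softmax(z) - y`, so the full Jacobian never has to be materialized. A hedged sketch of that shortcut; the function name is illustrative, not Dann's API (softmax restated for a self-contained snippet):

```javascript
function softmax(z) {
  const denom = z.reduce((sum, v) => sum + Math.exp(v), 0);
  return z.map((v) => Math.exp(v) / denom);
}

// Gradient of cross-entropy loss L = -sum(y[i] * log(s[i]))
// with respect to the pre-softmax inputs z, for a one-hot target y:
// dL/dz = softmax(z) - y
function softmaxCrossEntropyGrad(z, y) {
  const s = softmax(z);
  return s.map((si, i) => si - y[i]);
}
```

This is why many frameworks fuse the two into a single output-layer case instead of backpropagating through softmax on its own.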
Type: Feature
Description: Softmax activation function.