cazala / synaptic

architecture-free neural network library for node.js and the browser
http://caza.la/synaptic

Weighted Loss Function #292

Open markoarnauto opened 6 years ago

markoarnauto commented 6 years ago

First of all: great library! The clean code helps me to understand a bit about neural nets.

My question: how do I implement a weighted loss function for unbalanced training data, as mentioned here: stackoverflow? My attempt was a custom cost function like this:

var weight = class1 / class2; // ratio of the two class counts

cost: function (target, output) {
    var weighted_crossentropy = 0;
    for (var i = 0; i < output.length; i++) {
      weighted_crossentropy -= weight * target[i] * Math.log(output[i] + 1e-15) +
        (1 - target[i]) * Math.log(1 + 1e-15 - output[i]);
    }
    return weighted_crossentropy;
  }

Shouldn't the network then be optimized against that cost function? But the propagate method does not take the cost function into account. Am I misunderstanding something?
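The cost function itself seems to behave as intended, by the way. Here is a standalone check (plain JS, no synaptic involved; the function name is mine) showing that the weight scales the penalty for a badly predicted positive example:

```javascript
// Weighted binary cross-entropy over one output vector.
// weight > 1 penalizes missed positive examples more heavily.
function weightedCrossEntropy(target, output, weight) {
  var loss = 0;
  for (var i = 0; i < output.length; i++) {
    loss -= weight * target[i] * Math.log(output[i] + 1e-15) +
            (1 - target[i]) * Math.log(1 + 1e-15 - output[i]);
  }
  return loss;
}

// A positive example (target 1) predicted badly at 0.1:
var unweighted = weightedCrossEntropy([1], [0.1], 1); // ≈ 2.30
var weighted   = weightedCrossEntropy([1], [0.1], 5); // ≈ 11.51, i.e. 5x
```

So the loss reacts to the weight — the question is whether propagate ever sees it.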

ghost commented 6 years ago

Cost functions like MSE and friends are basically a visual representation and interpretation of the network's performance, for you only. You would use the result of the cost function to decide whether to train the network more, to optimize the prediction using different training data, etc., but the network itself works hard to learn as well as it can, and for that it needs no cost function. A neuron has its own built-in error function for which it gets optimized.

Check the XOR example. It just needs enough iterations to learn, but it does not need any cost function to work just fine.
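To illustrate the idea: here is a hand-rolled sketch of what the XOR example does (plain JS, not synaptic's actual code; initial weights are hand-picked so the run is deterministic). Each neuron updates from its own local delta, and no cost function appears anywhere in the loop:

```javascript
// 2-2-1 sigmoid network trained on XOR by plain backpropagation.
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

var wh = [[0.5, -0.4, 0.1], [-0.3, 0.6, -0.2]]; // hidden neurons: [w1, w2, bias]
var wo = [0.3, -0.5, 0.2];                      // output neuron:  [w1, w2, bias]
var data = [[[0, 0], 0], [[0, 1], 1], [[1, 0], 1], [[1, 1], 0]];
var rate = 0.5;

function forward(x) {
  var h = wh.map(function (w) { return sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]); });
  return { h: h, y: sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2]) };
}

// Squared error over the whole training set, for monitoring only.
function totalError() {
  return data.reduce(function (e, s) {
    var d = forward(s[0]).y - s[1];
    return e + 0.5 * d * d;
  }, 0);
}

var errorBefore = totalError();

for (var iter = 0; iter < 20000; iter++) {
  data.forEach(function (s) {
    var x = s[0], t = s[1], f = forward(x);
    // Each neuron's update comes from its local delta; no cost function is called.
    var dOut = (f.y - t) * f.y * (1 - f.y);
    for (var j = 0; j < 2; j++) {
      var dHid = dOut * wo[j] * f.h[j] * (1 - f.h[j]);
      wo[j] -= rate * dOut * f.h[j];
      wh[j][0] -= rate * dHid * x[0];
      wh[j][1] -= rate * dHid * x[1];
      wh[j][2] -= rate * dHid;
    }
    wo[2] -= rate * dOut;
  });
}

var errorAfter = totalError(); // typically near zero after enough iterations
```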

markoarnauto commented 6 years ago

The update policy is: w(τ+1) = w(τ) − η∇E(w(τ))

where w(τ+1) is the new weight and w(τ) the old one. η is the learning rate and E the cost function.

So the "built in error function" of the neuron is the derivative of the cost function, right? So the cost function is intrinsic for backpropagation.