nerd-toolkit / nerd


setTransferFunction in scripting causes nerd to freeze #28

Open htoutounji opened 11 years ago

htoutounji commented 11 years ago

I am writing a script for a transfer function in nerd where I change the bias term according to a learning rule. I still want to use the hyperbolic tangent transfer function. Loading the following script causes nerd to freeze:

var inSynapses;

function reset() {
  net.setTransferFunction(neuron, "tanh");
  inSynapses = net.getInSynapses(neuron);
}

function calc(activation) {
  activation = theta = net.getBias(neuron);

  sumW = 0.0;
  for (var i = 0; i < inSynapses.length; ++i) {
    sumW += net.getWeight(inSynapses[i]);
  }

  theta -= eta*(theta + sumW);
  net.setBias(neuron, theta);

  //nerd.error(getTransferFunctionName(206));

  return activation;
}

Loading the same script with the statement "net.setTransferFunction(neuron, "tanh");" commented out works. Uncommenting it after loading also causes the freeze. Is this a bug, or am I using the statement incorrectly?

cybott commented 11 years ago

You cannot replace the scripted transfer function with another transfer function during its own reset() call. This leads to undefined behavior (actually, it should generate a warning instead of a freeze :) ).

Instead, do not use an external transfer function to calculate the tanh(). Just call a custom local function myTanh() in which you implement the tanh equation, and return its result, e.g.

return myTanh(activation);
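
For illustration, a minimal sketch of such a local helper, using only plain script-side math (Math.exp is standard ECMAScript, so no additional nerd API is assumed; the name myTanh is just an example):

function myTanh(x) {
  // hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x)
  var ep = Math.exp(x);
  var em = Math.exp(-x);
  return (ep - em) / (ep + em);
}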

However, it does not seem necessary to use a ScriptedTransferFunction for this job. You could also use a ScriptedActivationFunction. In that case, you can use the setDefaultActivationFunction() method (see the documentation: http://www.ultopia.de/drupal/nerddoc/node/78#setDefaultActivationFunction).


htoutounji commented 11 years ago

Thanks! I'll write a local function then.

The reason I am not using a ScriptedActivationFunction is that the theta dynamics I am trying to implement sit on top of the SRN, and I didn't want to rewrite the SRN activation dynamics. I could set the SRN as the default activation function, but then I would lose the interface for observing variables and changing equations and parameters. Also, these theta dynamics cannot be implemented as a single formula in the scripted SRN version, because they need to sum up the incoming weights to the neuron, which requires a for-loop.
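
For reference, a hypothetical version of the scripted transfer function along the lines suggested above: it keeps the weight-summing loop and the bias update, drops the setTransferFunction() call from reset(), and applies the local myTanh() helper instead of an external transfer function. Only the net calls already shown in this thread are reused; eta and the exact interplay of activation and theta are placeholders to be adapted to the actual learning rule.

var inSynapses;
var eta = 0.1; // placeholder learning rate; defined elsewhere in the original script

function reset() {
  // no setTransferFunction() here -- replacing the scripted transfer
  // function from inside its own reset() is what caused the freeze
  inSynapses = net.getInSynapses(neuron);
}

function calc(activation) {
  var theta = net.getBias(neuron);

  var sumW = 0.0;
  for (var i = 0; i < inSynapses.length; ++i) {
    sumW += net.getWeight(inSynapses[i]);
  }

  theta -= eta*(theta + sumW);
  net.setBias(neuron, theta);

  return myTanh(activation); // local tanh instead of an external transfer function
}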