Closed by GoogleCodeExporter 9 years ago
Problem in package org.encog.neural.activation.ActivationTANH, in the method
private double activationFunction(double d): if d is large (> 350 on my PC), the TANH
expression
(Math.exp(d * 2.0) - 1.0) / (Math.exp(d * 2.0) + 1.0) returns NaN
Original comment by struzh...@gmail.com
on 3 Apr 2009 at 11:16
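A minimal sketch of the failure mode and one possible fix (the class and method names here are illustrative, not Encog's): Math.exp overflows to Infinity once its argument exceeds roughly 709.78, so for d > ~355 the quoted formula evaluates Infinity / Infinity, which is NaN. Delegating to Math.tanh, which saturates cleanly to +/-1, avoids the overflow.

```java
public class TanhOverflowDemo {
    // Naive tanh as quoted in the report: for large d, Math.exp(d * 2.0)
    // overflows to Infinity, and Infinity / Infinity is NaN.
    static double naiveTanh(double d) {
        return (Math.exp(d * 2.0) - 1.0) / (Math.exp(d * 2.0) + 1.0);
    }

    // Stable alternative: Math.tanh saturates to +/-1 instead of overflowing.
    static double stableTanh(double d) {
        return Math.tanh(d);
    }

    public static void main(String[] args) {
        System.out.println(naiveTanh(400.0));  // NaN
        System.out.println(stableTanh(400.0)); // 1.0
    }
}
```

An equivalent fix without Math.tanh is to rewrite the formula so the exponent is never positive, e.g. (1 - Math.exp(-2.0 * d)) / (1 + Math.exp(-2.0 * d)) for d >= 0, mirrored for negative d.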
I think this was fixed for 2.0, but I will look into it.
Original comment by JeffHeat...@gmail.com
on 8 Jul 2009 at 1:20
Fixed some time ago.
Original comment by heatonre...@gmail.com
on 14 Aug 2010 at 2:11
Hi,
I still get the NaN problem with a tanh activation function when there is more
than one tanh layer. For example, the following network of 5 layers
does not work:
network.addLayer(new BasicLayer(new ActivationSigmoid(),false,data[0].length));
network.addLayer(new BasicLayer(new ActivationTANH(),false,HIDDEN_DIM_SIG));
network.addLayer(new BasicLayer(new ActivationLinear(),false,HIDDEN_DIM_STEP));
network.addLayer(new BasicLayer(new ActivationTANH(),false,HIDDEN_DIM_SIG));
network.addLayer(new BasicLayer(new ActivationSigmoid(),false,HIDDEN_DIM_SIG));
However, if I use sigmoid and linear only, everything works as expected. After
a few experiments I think the problem is having layers with the following
activation functions in order: tanh - linear - tanh.
What was the cause of the previous bug, and how was it fixed?
Using newest version 2.4.3.
Thanks!
Original comment by goldstein.iupr
on 24 Sep 2010 at 2:59
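A hedged guess at why the tanh - linear - tanh ordering matters (the layer functions below are illustrative stand-ins, not Encog code): a tanh layer's outputs are bounded in [-1, 1], but a linear layer's weighted sum is unbounded, so with large enough weights or enough inputs it can feed a value into the next tanh that overflows the naive exp-based formula, and the resulting NaN then propagates through the rest of the network.

```java
import java.util.Arrays;

public class LinearIntoTanhDemo {
    // Hypothetical linear layer: an unbounded weighted sum of its inputs.
    static double linear(double[] inputs, double[] weights) {
        double sum = 0.0;
        for (int i = 0; i < inputs.length; i++) {
            sum += inputs[i] * weights[i];
        }
        return sum;
    }

    // Naive tanh as quoted in the original report.
    static double naiveTanh(double d) {
        return (Math.exp(d * 2.0) - 1.0) / (Math.exp(d * 2.0) + 1.0);
    }

    public static void main(String[] args) {
        // tanh outputs are bounded in [-1, 1] ...
        double[] tanhOutputs = new double[100];
        Arrays.fill(tanhOutputs, 1.0);
        // ... but the linear layer's weighted sum is not.
        double[] weights = new double[100];
        Arrays.fill(weights, 10.0);
        double preActivation = linear(tanhOutputs, weights); // 1000.0
        System.out.println(naiveTanh(preActivation)); // NaN: exp(2000) overflows
    }
}
```

If this is the mechanism, it would also explain why sigmoid + linear alone works: the sigmoid formula 1 / (1 + Math.exp(-d)) only computes exp of a negative argument for large positive d, which underflows to 0 rather than overflowing.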
Original issue reported on code.google.com by
struzh...@gmail.com
on 3 Apr 2009 at 8:26