farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras
GNU General Public License v2.0

Setting Activations on layers #260

Open ZadravecM opened 5 years ago

ZadravecM commented 5 years ago

Hi,

I am working with the seq2seq library and have run into a problem.

I have a vector with values ranging from 1 to 11399, so my (training) vector looks like: [1, 200, 1235, 11300, ...]

But the predicted values I get back are always between 0 and 1, like [0, 0.2, 0.3, 1, ...]

I guess this is due to an activation function (softmax?). Is there a way to define the activation function for the seq2seq model layers?

Marko

mladl commented 5 years ago

I ran into the same problem. Did you solve it?

yyb1995 commented 5 years ago

I think it's because there is a W2 dense layer after the normal LSTM layer. Line 45 of cell.py is `y = Activation(self.activation)(W2(h))`. Here `h` is the output of the LSTM cell, and `y` is the output of `W2`. So you can try changing the activation function on this line.

mladl commented 5 years ago

> I think it's because there is a W2 dense layer after the normal LSTM layer. Line 45 of cell.py is `y = Activation(self.activation)(W2(h))`. Here `h` is the output of the LSTM cell, and `y` is the output of `W2`. So you can try changing the activation function on this line.

I think you are right. The default activation function is tanh, which squashes the outputs into (-1, 1). Normalizing the data is another way to solve the problem.
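As an alternative to editing cell.py, the normalization idea can be sketched like this (plain NumPy; the helper names and the [0, 1] target range are assumptions for illustration): scale the targets into the activation's output range before training, then invert the scaling on the model's predictions.

```python
import numpy as np

def scale_to_unit(x, lo, hi):
    """Map values from [lo, hi] into [0, 1]."""
    return (x - lo) / (hi - lo)

def unscale(y, lo, hi):
    """Invert scale_to_unit: map [0, 1] predictions back to [lo, hi]."""
    return y * (hi - lo) + lo

targets = np.array([1, 200, 1235, 11300], dtype=float)  # values from the issue
lo, hi = 1.0, 11399.0                                   # range mentioned above

scaled = scale_to_unit(targets, lo, hi)   # train the model on these
restored = unscale(scaled, lo, hi)        # apply to the model's predictions

assert scaled.min() >= 0.0 and scaled.max() <= 1.0
assert np.allclose(restored, targets)     # the original scale is recovered
```

The same lo/hi values must be saved and reused at prediction time, otherwise the inverse transform will not recover the original scale.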