farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras
GNU General Public License v2.0
3.17k stars 845 forks

ValueError: An operation has `None` for gradient. #279

Open cui-xiaoang96 opened 4 years ago

cui-xiaoang96 commented 4 years ago

I'm running into a problem with gradients.

The code is:

```python
from seq2seq import SimpleSeq2Seq, Seq2Seq, AttentionSeq2Seq
import numpy as np

input_length = 5
input_dim = 3

output_length = 3
output_dim = 4

samples = 100
hidden_dim = 24

x = np.random.random((samples, input_length, input_dim))
y = np.random.random((samples, output_length, output_dim))

model = SimpleSeq2Seq(input_shape=(5, 3), hidden_dim=10, output_length=3,
                      output_dim=4, depth=(4, 5))

model.compile(loss='mse', optimizer='sgd')
model.fit(x, y, nb_epoch=10)
```

And the error is:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    model.fit(x, y, nb_epoch=10)
  File "E:\Anaconda\envs\tf2\lib\site-packages\keras\engine\training.py", line 1213, in fit
    self._make_train_function()
  File "E:\Anaconda\envs\tf2\lib\site-packages\keras\engine\training.py", line 316, in _make_train_function
    loss=self.total_loss)
  File "E:\Anaconda\envs\tf2\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "E:\Anaconda\envs\tf2\lib\site-packages\keras\optimizers.py", line 259, in get_updates
    grads = self.get_gradients(loss, params)
  File "E:\Anaconda\envs\tf2\lib\site-packages\keras\optimizers.py", line 93, in get_gradients
    raise ValueError('An operation has `None` for gradient. '
ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.
```

So, what should I do to solve it?

Thanks for any help.

shaoxiang commented 4 years ago

I solved this problem by downgrading TensorFlow from 2.x to 1.x.
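For anyone trying the same workaround, here is a minimal sanity check (just a sketch; it only assumes the downgraded TensorFlow is the one the interpreter actually imports):

```python
# Confirm the environment resolves to a TensorFlow 1.x build after the downgrade.
import tensorflow as tf

print("tensorflow:", tf.__version__)
assert tf.__version__.startswith("1."), "still importing TensorFlow 2.x"
```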

guodong324 commented 3 years ago

I solved this problem by downgrading Keras to 2.2.0.
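Same kind of check for this workaround, assuming it is the standalone `keras` package (not `tf.keras`) that gets pinned:

```python
# Confirm the standalone Keras package resolves to the pinned 2.2.0 release.
import keras

print("keras:", keras.__version__)
assert keras.__version__ == "2.2.0", "unexpected Keras version"
```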