farizrahman4u / seq2seq

Sequence to Sequence Learning with Keras

theano.gradient.DisconnectedInputError #148

Open binonymous opened 7 years ago

binonymous commented 7 years ago

Hi all,

My Keras install uses the Theano backend. I encountered the following error:

theano.gradient.DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: dense_1_W
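
For reference, here is a minimal Theano-only sketch (my own illustration with made-up variables, not code from my model) that raises the same exception when a gradient is requested for a variable the cost never uses:

# Hypothetical variables, for illustration only.
import theano
import theano.tensor as T

x = T.vector('x')
w_used = theano.shared(1.0, name='w_used')
w_unused = theano.shared(1.0, name='w_unused')  # never enters the cost graph

cost = T.sum(x * w_used)

g_ok = T.grad(cost, w_used)     # fine: w_used is part of the cost graph
g_bad = T.grad(cost, w_unused)  # raises theano.gradient.DisconnectedInputError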

My network structure:


____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_10 (InputLayer)            (None, 30, 6)         0
____________________________________________________________________________________________________
timedistributed_10 (TimeDistribu (None, 30, 50)        350         input_10[0][0]
____________________________________________________________________________________________________
recurrentcontainer_19 (Recurrent [(None, 50), None, No 20200       timedistributed_10[0][0]
____________________________________________________________________________________________________
dense_20 (Dense)                 (None, 2)             102         recurrentcontainer_19[0][0]
____________________________________________________________________________________________________
recurrentcontainer_20 (Recurrent (None, 3, 2)          10702       dense_20[0][0]
                                                                   dense_20[0][0]
                                                                   recurrentcontainer_19[0][1]
                                                                   recurrentcontainer_19[0][2]
====================================================================================================
Total params: 31354
____________________________________________________________________________________________________


And details of the error:

Traceback (most recent call last):
  File "stock_predict.py", line 146, in <module>
    train_logs = ens_nn_model.train(train_data, nb_epoch=10, val_data=test_data, model_dir=g_RESULT_DIR)
  File "/data1/basetech_f/benbinwu/T+N/smartquant_proj/stock_predictor/neural_network.py", line 154, in train
    train_mode, nb_epoch, (val_data.x, val_data.y), verbose)
  File "/data1/basetech_f/benbinwu/T+N/smartquant_proj/stock_predictor/neural_network.py", line 132, in _train
    verbose=verbose)
  File "/data1/basetech_f/benbinwu/T+N/smartquant_proj/stock_predictor/model_factory.py", line 103, in train
    validation_data=validation_data, verbose=verbose)
  File "/usr/lib64/python2.7/site-packages/keras/engine/training.py", line 1083, in fit
    self._make_train_function()
  File "/usr/lib64/python2.7/site-packages/keras/engine/training.py", line 696, in _make_train_function
    self.total_loss)
  File "/usr/lib64/python2.7/site-packages/keras/optimizers.py", line 200, in get_updates
    grads = self.get_gradients(loss, params)
  File "/usr/lib64/python2.7/site-packages/keras/optimizers.py", line 62, in get_gradients
    grads = K.gradients(loss, params)
  File "/usr/lib64/python2.7/site-packages/keras/backend/theano_backend.py", line 825, in gradients
    return T.grad(loss, variables)
  File "/usr/lib64/python2.7/site-packages/theano/gradient.py", line 545, in grad
    handle_disconnected(elem)
  File "/usr/lib64/python2.7/site-packages/theano/gradient.py", line 532, in handle_disconnected
    raise DisconnectedInputError(message)
theano.gradient.DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: dense_1_W
Backtrace when the node is created:
  File "/data1/basetech_f/benbinwu/T+N/smartquant_proj/stock_predictor/seq2seq/models.py", line 254, in Seq2Seq
    encoded_seq = dense1(input)
  File "/usr/lib64/python2.7/site-packages/keras/engine/topology.py", line 491, in __call__
    self.build(input_shapes[0])
  File "/usr/lib64/python2.7/site-packages/keras/layers/wrappers.py", line 98, in build
    self.layer.build(child_input_shape)
  File "/usr/lib64/python2.7/site-packages/keras/layers/core.py", line 727, in build
    name='{}_W'.format(self.name))
  File "/usr/lib64/python2.7/site-packages/keras/initializations.py", line 60, in glorot_uniform
    return uniform(shape, s, name=name)
  File "/usr/lib64/python2.7/site-packages/keras/initializations.py", line 33, in uniform
    return K.random_uniform_variable(shape, -scale, scale, name=name)
  File "/usr/lib64/python2.7/site-packages/keras/backend/theano_backend.py", line 142, in random_uniform_variable
    dtype=dtype, name=name)
  File "/usr/lib64/python2.7/site-packages/keras/backend/theano_backend.py", line 67, in variable
    return theano.shared(value=value, name=name, strict=False)
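
In case it is useful, here is a small diagnostic I can run after compile() (my own sketch, relying only on the Keras 1.x attributes visible in the traceback above, model.total_loss and model.trainable_weights) to list every trainable weight that is disconnected from the loss:

import theano
from theano.gradient import DisconnectedInputError

def find_disconnected_weights(model):
    # model.total_loss and model.trainable_weights are available once
    # compile() has run (Keras 1.x engine, Theano backend).
    disconnected = []
    for w in model.trainable_weights:
        try:
            theano.grad(model.total_loss, w)
        except DisconnectedInputError:
            disconnected.append(w)
    return disconnected

# e.g. print(find_disconnected_weights(ens_nn_model))  # after compile(), before fit()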

Looking forward to your advice!

farizrahman4u commented 7 years ago

Code?

binonymous commented 7 years ago
from seq2seq.models import Seq2Seq

# n_timesteps, n_input_dim and T_plus_N are defined earlier in the script.
ens_nn_model = Seq2Seq(input_shape=(n_timesteps, n_input_dim),
                       hidden_dim=50,
                       output_length=T_plus_N,
                       output_dim=2)
ens_nn_model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])
ens_nn_model.summary()
ens_nn_model.fit(train_data.x, train_data.y, nb_epoch=10,
                 batch_size=1024,
                 validation_data=(test_data.x, test_data.y), verbose=0)

Network structure:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, 30, 6)         0                                            
____________________________________________________________________________________________________
timedistributed_1 (TimeDistribut (None, 30, 50)        350         input_1[0][0]                    
____________________________________________________________________________________________________
recurrentcontainer_1 (RecurrentC [(None, 50), None, No 20200       timedistributed_1[0][0]          
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 2)             102         recurrentcontainer_1[0][0]       
____________________________________________________________________________________________________
recurrentcontainer_2 (RecurrentC (None, 3, 2)          10702       dense_2[0][0]                    
                                                                   dense_2[0][0]                    
                                                                   recurrentcontainer_1[0][1]       
                                                                   recurrentcontainer_1[0][2]       
====================================================================================================
Total params: 31354
____________________________________________________________________________________________________