keras-team / keras

Deep Learning for humans
http://keras.io/

How to write my own loss function? #4462

Closed superYangwenwen closed 7 years ago

superYangwenwen commented 7 years ago

I want to use my own loss function to compute the gradient. The loss function is listed below.

import theano
import theano.tensor as T

def my_loss(y_true, y_pred):

    def step(y_true_step, y_pred_step):
        # label_pre and cal_loss are my own helper functions (not shown here)
        true, pred = label_pre(y_true_step, y_pred_step)
        loss = cal_loss(true, pred)
        return loss

    # Compute the per-sample loss over the batch with theano.scan
    loss, _ = theano.scan(
        fn=step,
        outputs_info=None,
        sequences=[y_true, y_pred]
    )
    return T.mean(loss)
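(For context, the loss is passed to `model.compile` in the usual way; a hypothetical sketch, assuming `model`, `X_train`, and `Y_train` already exist:)

    # Hypothetical usage of the custom loss above
    model.compile(optimizer='adam', loss=my_loss)
    model.fit(X_train, Y_train, validation_data=(X_valid, Y_valid))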

However, I get the error "DisconnectedInputError". Does anyone know the reason?

Traceback (most recent call last):
  File "exam/cnn_lstm_ctc/cnn_lstm_ctc.py", line 379, in <module>
    main()
  File "exam/cnn_lstm_ctc/cnn_lstm_ctc.py", line 356, in main
    validation_data=(X_valid, Y_valid))
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/models.py", line 640, in fit
    sample_weight=sample_weight)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/engine/training.py", line 1103, in fit
    self._make_train_function()
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/engine/training.py", line 720, in _make_train_function
    self.total_loss)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/optimizers.py", line 310, in get_updates
    grads = self.get_gradients(loss, params)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/optimizers.py", line 62, in get_gradients
    grads = K.gradients(loss, params)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/backend/theano_backend.py", line 825, in gradients
    return T.grad(loss, variables)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Theano-0.9.0.dev2-py2.7.egg/theano/gradient.py", line 533, in grad
    handle_disconnected(elem)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Theano-0.9.0.dev2-py2.7.egg/theano/gradient.py", line 520, in handle_disconnected
    raise DisconnectedInputError(message)
theano.gradient.DisconnectedInputError:
Backtrace when that variable is created:

  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/models.py", line 325, in add
    output_tensor = layer(self.outputs[0])
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/engine/topology.py", line 493, in __call__
    self.build(input_shapes[0])
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/layers/wrappers.py", line 98, in build
    self.layer.build(child_input_shape)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/layers/convolutional.py", line 408, in build
    self.W = self.init(self.W_shape, name='{}_W'.format(self.name))
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/initializations.py", line 59, in glorot_uniform
    return uniform(shape, s, name=name)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/initializations.py", line 32, in uniform
    return K.random_uniform_variable(shape, -scale, scale, name=name)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/backend/theano_backend.py", line 142, in random_uniform_variable
    dtype=dtype, name=name)
  File "/search/yangwenwen/anaconda2/lib/python2.7/site-packages/Keras-1.1.1-py2.7.egg/keras/backend/theano_backend.py", line 67, in variable
    return theano.shared(value=value, name=name, strict=False)

happygds commented 7 years ago

You should search previous issues first; see #2662.
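For reference, the pattern discussed there is to write the loss entirely with `keras.backend` ops on `y_true` and `y_pred`, so the loss stays differentiable with respect to the model weights. A minimal sketch (mean squared error as a placeholder, not your `label_pre`/`cal_loss` logic; `model` is assumed to exist):

    from keras import backend as K

    def my_loss(y_true, y_pred):
        # Example only: the loss is built from symbolic backend ops,
        # so Theano can compute gradients through it.
        return K.mean(K.square(y_pred - y_true), axis=-1)

    model.compile(optimizer='adam', loss=my_loss)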

stale[bot] commented 7 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.