kgrm closed this issue 8 years ago
Same error for me!
Your model apparently behaves differently in training and test mode, and so needs to know which mode it should be using. Use

iterate = K.function([input_img, K.learning_phase()], [loss, grads])

and pass 1 or 0 as the value for the learning phase, depending on whether you want the model in training mode or test mode.
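The flag is needed because layers like Dropout literally compute a different function in each mode. A minimal NumPy sketch of the idea (illustrative only; `dropout_layer` is a made-up helper, not Keras internals):

```python
import numpy as np

def dropout_layer(x, rate, training):
    """Inverted dropout: stochastic in training mode, identity in test mode."""
    if training:
        # Zero out units with probability `rate`, rescale the survivors.
        mask = (np.random.rand(*x.shape) >= rate) / (1.0 - rate)
        return x * mask
    # Test mode: the layer is a no-op, so the output is deterministic.
    return x

x = np.ones(4)
train_out = dropout_layer(x, 0.5, training=1)  # stochastic
test_out = dropout_layer(x, 0.5, training=0)   # always equals x
```

Because the two branches disagree, the compiled graph needs the mode (1 or 0) as an extra input, which is exactly what `K.learning_phase()` threads through.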
I also have a similar problem, but it occurs when I use fit() or train_on_batch(). I used K.in_train_phase(pos_score, neg_score) in a user-defined layer, where pos_score and neg_score are both (symbolically) computed from the layer input x. Even with K.in_train_phase(pos_score, neg_score), it shows the same error:
MissingInputError Traceback (most recent call last)
<ipython-input-33-09ce76894494> in <module>()
----> 1 model.predict_on_batch([src, dst])
/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/engine/training.pyc in predict_on_batch(self, x)
1205 else:
1206 ins = x
-> 1207 self._make_predict_function()
1208 outputs = self.predict_function(ins)
1209 if len(outputs) == 1:
/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/engine/training.pyc in _make_predict_function(self)
687 self.outputs,
688 updates=self.state_updates,
--> 689 **self._function_kwargs)
690
691 def _fit_loop(self, f, ins, out_labels=[], batch_size=32,
/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/backend/theano_backend.pyc in function(inputs, outputs, updates, **kwargs)
507 msg = "Invalid argument '%s' passed to K.function" % key
508 raise ValueError(msg)
--> 509 return Function(inputs, outputs, updates=updates, **kwargs)
510
511
/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/backend/theano_backend.pyc in __init__(self, inputs, outputs, updates, **kwargs)
493 allow_input_downcast=True,
494 on_unused_input='warn',
--> 495 **kwargs)
496
497 def __call__(self, inputs):
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function.pyc in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
318 on_unused_input=on_unused_input,
319 profile=profile,
--> 320 output_keys=output_keys)
321 # We need to add the flag check_aliased inputs if we have any mutable or
322 # borrowed used defined inputs
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/pfunc.pyc in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
477 accept_inplace=accept_inplace, name=name,
478 profile=profile, on_unused_input=on_unused_input,
--> 479 output_keys=output_keys)
480
481
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in orig_function(inputs, outputs, mode, accept_inplace, name, profile, on_unused_input, output_keys)
1774 profile=profile,
1775 on_unused_input=on_unused_input,
-> 1776 output_keys=output_keys).create(
1777 defaults)
1778
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in __init__(self, inputs, outputs, mode, accept_inplace, function_builder, profile, on_unused_input, fgraph, output_keys)
1426 # OUTPUT VARIABLES)
1427 fgraph, additional_outputs = std_fgraph(inputs, outputs,
-> 1428 accept_inplace)
1429 fgraph.profile = profile
1430 else:
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in std_fgraph(input_specs, output_specs, accept_inplace)
175
176 fgraph = gof.fg.FunctionGraph(orig_inputs, orig_outputs,
--> 177 update_mapping=update_mapping)
178
179 for node in fgraph.apply_nodes:
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __init__(self, inputs, outputs, features, clone, update_mapping)
169
170 for output in outputs:
--> 171 self.__import_r__(output, reason="init")
172 for i, output in enumerate(outputs):
173 output.clients.append(('output', i))
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __import_r__(self, variable, reason)
358 # Imports the owners of the variables
359 if variable.owner and variable.owner not in self.apply_nodes:
--> 360 self.__import__(variable.owner, reason=reason)
361 if (variable.owner is None and
362 not isinstance(variable, graph.Constant) and
/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __import__(self, apply_node, check, reason)
472 "for more information on this error."
473 % str(node)),
--> 474 r)
475
476 for node in new_nodes:
MissingInputError: ("An input of the graph, used to compute DimShuffle{x}(keras_learning_phase), was not provided and not given a value.Use the Theano flag exception_verbosity='high',for more information on this error.", keras_learning_phase)
Good thing that the post right above yours explains what you need to do.
I am not sure if it is the same problem, since K.learning_phase() used in an Embedding layer works just fine with no additional input. Why does it need to be added when using a user-defined layer/function?
For a user-defined layer that uses K.in_train_phase, you have to set self.uses_learning_phase = True so that train_on_batch/predict_on_batch and so on can set the phase correctly. I was following this guide and didn't notice this. It would have been better if the guide/docs had explicitly mentioned setting self.uses_learning_phase = True.
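The pattern described here can be sketched framework-free (MockLayer and its call signature are illustrative stand-ins, not the actual Keras 1.x Layer API):

```python
class MockLayer:
    """Toy analogue of a custom layer that branches on the learning phase."""

    def __init__(self):
        # Without this flag the model never wires up the learning-phase
        # input, which is what later surfaces as MissingInputError.
        self.uses_learning_phase = True

    def call(self, pos_score, neg_score, learning_phase):
        # Analogue of K.in_train_phase(pos_score, neg_score).
        return pos_score if learning_phase else neg_score

layer = MockLayer()
train_value = layer.call(10, -10, learning_phase=1)  # -> 10
test_value = layer.call(10, -10, learning_phase=0)   # -> -10
```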
Hi, guys! @kgrm @LouisMartin @chentingpc @fchollet @bmabey Would you help me solve a problem I hit when getting the features of a Flatten layer? https://github.com/fchollet/keras/issues/431 Any help will be much appreciated!
Thanks a lot.
When I used the code from the blog post "How convolutional neural networks see the world" to visualize my own model's layers (conv_1, conv_2, conv_3, ...), I changed

iterate = K.function([input_img], [loss, grads])
loss_value, grads_value = iterate([input_img_data])

to

iterate = K.function([input_img, K.learning_phase()], [loss, grads])
loss_value, grads_value = iterate([input_img_data, 1])

and it worked fine.
Just as the FAQ says: regularization mechanisms, such as Dropout and L1/L2 weight regularization, are turned off at testing time.
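The same asymmetry holds for weight regularization: the penalty is added to the training objective only. A hedged NumPy sketch (`total_loss` is a made-up helper, not a Keras function):

```python
import numpy as np

def total_loss(data_loss, weights, l2=0.01, training=True):
    """Add the L2 weight penalty only in training mode."""
    if not training:
        # At test time you report the plain data loss.
        return data_loss
    penalty = l2 * sum(np.sum(w ** 2) for w in weights)
    return data_loss + penalty

weights = [np.ones((2, 2))]  # sum of squared weights = 4
train_loss = total_loss(1.0, weights, l2=0.01, training=True)   # 1.0 + 0.04
test_loss = total_loss(1.0, weights, l2=0.01, training=False)   # 1.0
```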
Using
K.function([input_img, K.learning_phase()], [loss])
gives me an error:
TypeError: Unknown parameter type: <type 'int'>
How can I solve this?
I am getting all zeros in pooled_grads_value for some images. So I followed @fchollet's suggestion of adding K.learning_phase() and set a scalar value of zero for it, but the entire (512,) array pooled_grads_value is still zero for some of the sample images.
last_conv_layer = model.get_layer('conv2d_13')
grads = K.gradients(sample_output, last_conv_layer.output)[0]
#grads = normalize_grad(grads)
pooled_grads = K.mean(grads, axis=(0, 2, 3))
iterate = K.function([model.input, K.learning_phase()], [pooled_grads, last_conv_layer.output[0]])
pooled_grads_value, conv_layer_output_value = iterate([x, 0])
I do appreciate if you can help me to resolve the issue.
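All-zero pooled gradients usually mean the gradient tensor itself was zero before pooling (e.g. dead ReLUs or a saturated output for that image), not a pooling bug. A quick NumPy check of the same reduction (`grads` here is a stand-in array, not the symbolic result of K.gradients):

```python
import numpy as np

# Stand-in for grads evaluated on one image: (batch, channels, h, w).
grads = np.zeros((1, 512, 14, 14))

# Same reduction as K.mean(grads, axis=(0, 2, 3)) in the snippet above.
pooled = grads.mean(axis=(0, 2, 3))  # shape (512,)

all_zero = not np.any(pooled)  # True here; flags images worth inspecting
```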
I've minimally modified the conv_filter_visualization.py example to run on a network I've trained myself (see https://gist.github.com/kgrm/67555890a3e07cab709a7a81cc487c31). The original script (with the provided weights file) works fine. However, on trying to run my modified one, I get the following Theano error: