keras-team / keras

Theano error on trying to adapt the visualization example #2417

Closed kgrm closed 8 years ago

kgrm commented 8 years ago

I've minimally modified the conv_filter_visualization.py example to run on a network I've trained myself (see https://gist.github.com/kgrm/67555890a3e07cab709a7a81cc487c31). The original script (with the provided weights file) works fine. However, when I try to run my modified one, I get the following Theano error:

$ python alexnet_visualization.py 
Using Theano backend.
Using gpu device 0: GeForce GTX TITAN X (CNMeM is enabled with initial size: 95.0% of memory, cuDNN 5004)
Weights loaded.
Model loaded.
convolution2d_1
maxpooling2d_1
batchnormalization_1
zeropadding2d_1
convolution2d_2
maxpooling2d_2
batchnormalization_2
zeropadding2d_2
convolution2d_3
zeropadding2d_3
convolution2d_4
zeropadding2d_4
convolution2d_5
maxpooling2d_3
flatten_1
dense_1
dropout_1
dense_2
dropout_2
dense_3
Processing filter 0
Traceback (most recent call last):
  File "alexnet_visualization.py", line 116, in <module>
    iterate = K.function([input_img], [loss, grads])
  File "/usr/local/lib/python2.7/dist-packages/keras/backend/theano_backend.py", line 509, in function
    return Function(inputs, outputs, updates=updates, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/keras/backend/theano_backend.py", line 495, in __init__
    **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function.py", line 322, in function
    output_keys=output_keys)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/pfunc.py", line 480, in pfunc
    output_keys=output_keys)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 1817, in orig_function
    output_keys=output_keys).create(
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 1469, in __init__
    accept_inplace)
  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 177, in std_fgraph
    update_mapping=update_mapping)
  File "/usr/local/lib/python2.7/dist-packages/theano/gof/fg.py", line 182, in __init__
    self.__import_r__(output, reason="init")
  File "/usr/local/lib/python2.7/dist-packages/theano/gof/fg.py", line 371, in __import_r__
    self.__import__(variable.owner, reason=reason)
  File "/usr/local/lib/python2.7/dist-packages/theano/gof/fg.py", line 413, in __import__
    variable=r)
theano.gof.fg.MissingInputError: An input of the graph, used to compute DimShuffle{x,x,x,x}(keras_learning_phase), was not provided and not given a value.Use the Theano flag exception_verbosity='high',for more information on this error.

Backtrace when the variable is created:
  File "alexnet_visualization.py", line 7, in <module>
    from keras.models import Sequential
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 5, in <module>
    from . import backend as K
  File "/usr/local/lib/python2.7/dist-packages/keras/backend/__init__.py", line 51, in <module>
    from .theano_backend import *
  File "/usr/local/lib/python2.7/dist-packages/keras/backend/theano_backend.py", line 13, in <module>
    _LEARNING_PHASE = T.scalar(dtype='uint8', name='keras_learning_phase')  # 0 = test, 1 = train


louismartin commented 8 years ago

Bump. Same error for me!

fchollet commented 8 years ago

Your model apparently has a different behavior in training and test mode, and so needs to know what mode it should be using.

Use

iterate = K.function([input_img, K.learning_phase()], [loss, grads])

and pass 1 or 0 as the value for the learning phase, depending on whether you want the model in training mode or test mode.
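
For example (a minimal sketch; input_img_data is assumed to be a batch-shaped numpy array, as in the visualization example):

iterate = K.function([input_img, K.learning_phase()], [loss, grads])
# 0 = test mode, 1 = training mode
loss_value, grads_value = iterate([input_img_data, 0])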

chentingpc commented 8 years ago

I also have a similar problem, but it occurs when I run fit() or train_on_batch(). I used K.in_train_phase(pos_score, neg_score) in a user-defined layer, where pos_score and neg_score are both (symbolically) computed from the layer input x. Even with K.in_train_phase(pos_score, neg_score), it shows the same error:


MissingInputError                         Traceback (most recent call last)
<ipython-input-33-09ce76894494> in <module>()
----> 1 model.predict_on_batch([src, dst])

/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/engine/training.pyc in predict_on_batch(self, x)
   1205         else:
   1206             ins = x
-> 1207         self._make_predict_function()
   1208         outputs = self.predict_function(ins)
   1209         if len(outputs) == 1:

/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/engine/training.pyc in _make_predict_function(self)
    687                                                self.outputs,
    688                                                updates=self.state_updates,
--> 689                                                **self._function_kwargs)
    690 
    691     def _fit_loop(self, f, ins, out_labels=[], batch_size=32,

/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/backend/theano_backend.pyc in function(inputs, outputs, updates, **kwargs)
    507                 msg = "Invalid argument '%s' passed to K.function" % key
    508                 raise ValueError(msg)
--> 509     return Function(inputs, outputs, updates=updates, **kwargs)
    510 
    511 

/home/chentingpc/anaconda/lib/python2.7/site-packages/Keras-1.0.1-py2.7.egg/keras/backend/theano_backend.pyc in __init__(self, inputs, outputs, updates, **kwargs)
    493                                         allow_input_downcast=True,
    494                                         on_unused_input='warn',
--> 495                                         **kwargs)
    496 
    497     def __call__(self, inputs):

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function.pyc in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
    318                    on_unused_input=on_unused_input,
    319                    profile=profile,
--> 320                    output_keys=output_keys)
    321     # We need to add the flag check_aliased inputs if we have any mutable or
    322     # borrowed used defined inputs

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/pfunc.pyc in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
    477                          accept_inplace=accept_inplace, name=name,
    478                          profile=profile, on_unused_input=on_unused_input,
--> 479                          output_keys=output_keys)
    480 
    481 

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in orig_function(inputs, outputs, mode, accept_inplace, name, profile, on_unused_input, output_keys)
   1774                    profile=profile,
   1775                    on_unused_input=on_unused_input,
-> 1776                    output_keys=output_keys).create(
   1777             defaults)
   1778 

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in __init__(self, inputs, outputs, mode, accept_inplace, function_builder, profile, on_unused_input, fgraph, output_keys)
   1426             # OUTPUT VARIABLES)
   1427             fgraph, additional_outputs = std_fgraph(inputs, outputs,
-> 1428                                                     accept_inplace)
   1429             fgraph.profile = profile
   1430         else:

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/compile/function_module.pyc in std_fgraph(input_specs, output_specs, accept_inplace)
    175 
    176     fgraph = gof.fg.FunctionGraph(orig_inputs, orig_outputs,
--> 177                                   update_mapping=update_mapping)
    178 
    179     for node in fgraph.apply_nodes:

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __init__(self, inputs, outputs, features, clone, update_mapping)
    169 
    170         for output in outputs:
--> 171             self.__import_r__(output, reason="init")
    172         for i, output in enumerate(outputs):
    173             output.clients.append(('output', i))

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __import_r__(self, variable, reason)
    358         # Imports the owners of the variables
    359         if variable.owner and variable.owner not in self.apply_nodes:
--> 360                 self.__import__(variable.owner, reason=reason)
    361         if (variable.owner is None and
    362                 not isinstance(variable, graph.Constant) and

/home/chentingpc/anaconda/lib/python2.7/site-packages/theano/gof/fg.pyc in __import__(self, apply_node, check, reason)
    472                             "for more information on this error."
    473                             % str(node)),
--> 474                             r)
    475 
    476         for node in new_nodes:

MissingInputError: ("An input of the graph, used to compute DimShuffle{x}(keras_learning_phase), was not provided and not given a value.Use the Theano flag exception_verbosity='high',for more information on this error.", keras_learning_phase)

fchollet commented 8 years ago

Good thing that the post right above yours explains what you need to do.

chentingpc commented 8 years ago

I am not sure it is the same problem, since K.learning_phase() used in the Embedding layer works just fine with no additional input. Why does it need to be added when using a user-defined layer/function?

chentingpc commented 8 years ago

A user-defined layer that uses K.in_train_phase has to set self.uses_learning_phase = True so that train_on_batch/predict_on_batch and so on can set the learning phase correctly. I was following this guide and didn't notice this. It would have been better if the guide/doc explicitly mentioned setting self.uses_learning_phase = True.
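
A minimal sketch of such a layer (the class name and the toy train/test expressions are illustrative, not from the original post; import paths as in Keras 1.x):

from keras import backend as K
from keras.engine.topology import Layer

class TrainTestSwitch(Layer):
    def __init__(self, **kwargs):
        super(TrainTestSwitch, self).__init__(**kwargs)
        # Tell Keras this layer depends on the learning phase; without this
        # flag, compiling the predict/train functions can raise the
        # MissingInputError shown above.
        self.uses_learning_phase = True

    def call(self, x, mask=None):
        # One (toy) expression at training time, another at test time.
        return K.in_train_phase(x * 2, x)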

tangjie77wd commented 8 years ago

Hi, guys! @kgrm @LouisMartin @chentingpc @fchollet @bmabey Could you help me solve the problem I ran into while getting the features of a Flatten layer? https://github.com/fchollet/keras/issues/431 Any help will be much appreciated!

absentm commented 7 years ago

Thanks a lot.

When I used the code from the blog post "How convolutional neural networks see the world" to visualize my own model's layers (conv_1, conv_2, conv_3, ...), I changed the code:

iterate = K.function([input_img], [loss, grads])
loss_value, grads_value = iterate([input_img_data])

to

iterate = K.function([input_img, K.learning_phase()], [loss, grads])
loss_value, grads_value = iterate([input_img_data, 1])

and it worked fine.

Just as the FAQ says: regularization mechanisms, such as Dropout and L1/L2 weight regularization, are turned off at testing time.

jrkager commented 7 years ago

Using K.function([input_img, K.learning_phase()], [loss]) gives me an error

TypeError: Unknown parameter type: <type 'int'>

How can I solve this?

HRKpython commented 5 years ago

I am getting all zeros in pooled_grads_value for some images. So I followed @fchollet's suggestion of adding K.learning_phase() and passed a scalar value of zero for it. But the entire (512,) array pooled_grads_value is still zero for some of the sample images.

# Gradient of the target output with respect to the last conv layer's feature maps
last_conv_layer = model.get_layer('conv2d_13')
grads = K.gradients(sample_output, last_conv_layer.output)[0]

#grads = normalize_grad(grads)

# Mean gradient per filter, averaging over the batch and spatial axes
# (channels-first ordering, giving a (512,) vector)
pooled_grads = K.mean(grads, axis=(0, 2, 3))
iterate = K.function([model.input, K.learning_phase()], [pooled_grads, last_conv_layer.output[0]])

pooled_grads_value, conv_layer_output_value = iterate([x, 0])
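
For reference, the original conv_filter_visualization.py example normalizes the gradient tensor by its L2 norm before building the function; a sketch of such a helper (this body is an assumption modeled on that example, not necessarily the actual normalize_grad commented out above):

def normalize_grad(g):
    # Scale the gradient tensor by its L2 norm (assumed helper body,
    # following the Keras filter-visualization example).
    return g / (K.sqrt(K.mean(K.square(g))) + 1e-5)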

I would appreciate it if you could help me resolve this issue.