katywarr / strengthening-dnns

Code repository to accompany the O'Reilly book: "Strengthening Deep Neural Networks: Making AI Less Susceptible to Adversarial Trickery"
https://learning.oreilly.com/library/view/strengthening-deep-neural/9781492044949/
MIT License

Bug: Chapter5 (Saliency) - InvalidArgumentError: <tensor> is both fed and fetched #2

Closed: katywarr closed this issue 4 years ago

katywarr commented 4 years ago

This bug occurs when the keras-vis visualize_saliency function is called in both of the following chapter 5 notebooks:

chapter5\fashionMNIST_vis_saliency 
chapter5\resnet50_vis_saliency

For example:

---------------------------------------------------------------------------
InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-6-39c4a59cd429> in <module>
     16                            layer_idx,
     17                            filter_indices=label,
---> 18                            seed_input=image)
     19 
     20 print(grads.shape)

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\vis\visualization\saliency.py in visualize_saliency(model, layer_idx, filter_indices, seed_input, backprop_modifier, grad_modifier)
    123         (ActivationMaximization(model.layers[layer_idx], filter_indices), -1)
    124     ]
--> 125     return visualize_saliency_with_losses(model.input, losses, seed_input, grad_modifier)
    126 
    127 

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\vis\visualization\saliency.py in visualize_saliency_with_losses(input_tensor, losses, seed_input, grad_modifier)
     71     """
     72     opt = Optimizer(input_tensor, losses, norm_grads=False)
---> 73     grads = opt.minimize(seed_input=seed_input, max_iter=1, grad_modifier=grad_modifier, verbose=False)[1]
     74 
     75     channel_idx = 1 if K.image_data_format() == 'channels_first' else -1

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\vis\optimizer.py in minimize(self, seed_input, max_iter, input_modifiers, grad_modifier, callbacks, verbose)
    141 
    142             # 0 learning phase for 'test'
--> 143             computed_values = self.compute_fn([seed_input, 0])
    144             losses = computed_values[:len(self.loss_names)]
    145             named_losses = zip(self.loss_names, losses)

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\keras\backend\tensorflow_backend.py in __call__(self, inputs)
   2713                 return self._legacy_call(inputs)
   2714 
-> 2715             return self._call(inputs)
   2716         else:
   2717             if py_any(is_tensor(x) for x in inputs):

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\keras\backend\tensorflow_backend.py in _call(self, inputs)
   2669                                 feed_symbols,
   2670                                 symbol_vals,
-> 2671                                 session)
   2672         if self.run_metadata:
   2673             fetched = self._callable_fn(*array_vals, run_metadata=self.run_metadata)

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\keras\backend\tensorflow_backend.py in _make_callable(self, feed_arrays, feed_symbols, symbol_vals, session)
   2621             callable_opts.run_options.CopyFrom(self.run_options)
   2622         # Create callable.
-> 2623         callable_fn = session._make_callable_from_options(callable_opts)
   2624         # Cache parameters corresponding to the generated callable, so that
   2625         # we can detect future mismatches and refresh the callable.

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\tensorflow\python\client\session.py in _make_callable_from_options(self, callable_options)
   1469     """
   1470     self._extend_graph()
-> 1471     return BaseSession._Callable(self, callable_options)
   1472 
   1473 

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\tensorflow\python\client\session.py in __init__(self, session, callable_options)
   1423         with errors.raise_exception_on_not_ok_status() as status:
   1424           self._handle = tf_session.TF_SessionMakeCallable(
-> 1425               session._session, options_ptr, status)
   1426       finally:
   1427         tf_session.TF_DeleteBuffer(options_ptr)

~\Anaconda3\envs\strengthening-dnns\lib\site-packages\tensorflow\python\framework\errors_impl.py in __exit__(self, type_arg, value_arg, traceback_arg)
    526             None, None,
    527             compat.as_text(c_api.TF_Message(self.status.status)),
--> 528             c_api.TF_GetCode(self.status.status))
    529     # Delete the underlying status object from memory otherwise it stays alive
    530     # as there is a reference to status from this from the traceback due to

InvalidArgumentError: flatten_input_1:0 is both fed and fetched.
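
For reference, the failing call follows this pattern (a minimal sketch with illustrative names; the model file, image and label below are placeholders for whatever the notebook actually loads):

import numpy as np
from keras.models import load_model
from vis.visualization import visualize_saliency

# Hypothetical inputs: a trained classifier and a single test image with its
# class index, as used in the chapter 5 saliency notebooks.
model = load_model('fashion_mnist_classifier.h5')   # placeholder model file
image = np.load('test_image.npy')                   # placeholder single image
label = 9                                           # placeholder class index
layer_idx = -1                                      # final (output) layer

# With the keras-vis release from PyPI this raises:
# InvalidArgumentError: <input tensor> is both fed and fetched.
grads = visualize_saliency(model,
                           layer_idx,
                           filter_indices=label,
                           seed_input=image)
print(grads.shape)
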
JafarBadour commented 4 years ago

I had the same issue; I still don't know how to solve it.

katywarr commented 4 years ago

Thanks @JafarBadour, I'll take a look at it in the next few days.

katywarr commented 4 years ago

The problem is fixed in a later version of keras-vis (0.5.0) which has not been released to PyPI; the fix is described here: keras-vis/issues/158. The quick fix is to upgrade keras-vis using the command below and then re-run the existing notebooks:

pip install git+https://github.com/raghakot/keras-vis.git -U
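
After upgrading, you can confirm which version is installed with:

pip show keras-vis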

The LIME library also provides a nice (and different) approach to explaining the classifications; this wasn't included in the book. I am rewriting the Jupyter saliency notebooks to include this approach (see the code sketch after the dependency list below).

The new notebooks will require the following dependencies:

For LIME:

conda install lime
conda install scikit-learn

For keras-vis:

pip install git+https://github.com/raghakot/keras-vis.git -U
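
As an illustration of what the LIME-based explanation looks like in code (a minimal sketch; the model file, image and scaling below are assumptions rather than the exact code in the rewritten notebooks):

import numpy as np
from keras.models import load_model
from lime import lime_image
from skimage.segmentation import mark_boundaries
import matplotlib.pyplot as plt

# Hypothetical inputs: a trained classifier and one 3-channel test image in
# the value range the model expects. LIME's image explainer works on
# 3-channel images, so grayscale Fashion-MNIST images need converting first.
model = load_model('my_classifier.h5')      # placeholder model file
image = np.load('test_image.npy')           # placeholder image, shape (H, W, 3)

explainer = lime_image.LimeImageExplainer()

# LIME perturbs the image, queries the model on the perturbed copies and fits
# a simple local surrogate model to explain the prediction.
explanation = explainer.explain_instance(image.astype('double'),
                                         model.predict,
                                         top_labels=5,
                                         hide_color=0,
                                         num_samples=1000)

# Highlight the superpixels that contributed most to the top predicted class.
temp, mask = explanation.get_image_and_mask(explanation.top_labels[0],
                                            positive_only=True,
                                            num_features=5,
                                            hide_rest=False)
plt.imshow(mark_boundaries(temp / 255.0, mask))   # adjust scaling to the image range
plt.show()

Unlike the gradient-based saliency map from keras-vis, this highlights whole superpixels that most influenced the classification.
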
katywarr commented 4 years ago

To do:

katywarr commented 4 years ago

Unfortunately, there's no easy way to update the environment yml file with the required pip install from GitHub. I have added a comment to the FashionMNIST Jupyter notebook to address this.

Both notebooks have been updated and fixed in the commit here.