microsoft / tensorwatch

Debugging, monitoring and visualization for Python Machine Learning and Data Science

RuntimeError: you can only change requires_grad flags of leaf variables. #25

Open · Tramac opened this issue 5 years ago

Tramac commented 5 years ago

Hi, thanks for the great work!

When I call `results = saliency.get_image_saliency_results(model, img, input_tensor, prediction_tensor, methods=['gradcam', 'smooth_grad'])` in cnn_pred_explain.ipynb, the following error happens:

Traceback (most recent call last):
  File "debug.py", line 18, in <module>
    results = saliency.get_image_saliency_results(model, img, input_tensor, prediction_tensor, methods=['gradcam', 'smooth_grad'])
  File "Anaconda3\lib\site-packages\tensorwatch\saliency\saliency.py", line 93, in get_image_saliency_results
    sal = get_saliency(model, raw_image, input, label, method=method)
  File "Anaconda3\lib\site-packages\tensorwatch\saliency\saliency.py", line 76, in get_saliency
    saliency = exp.explain(input, label, raw_input)
  File "Anaconda3\lib\site-packages\tensorwatch\saliency\backprop.py", line 146, in explain
    grad = self.base_explainer.explain(noisy_inp, ind)
  File "Anaconda3\lib\site-packages\tensorwatch\saliency\backprop.py", line 30, in explain
    return self._backprop(inp, ind)
  File "Anaconda3\lib\site-packages\tensorwatch\saliency\backprop.py", line 12, in _backprop
    inp.requires_grad = True
RuntimeError: you can only change requires_grad flags of leaf variables.

What's wrong with this?

sytelus commented 5 years ago

I've seen this error with another model. We will investigate this issue. I'll keep it open for future updates.
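For reference, here is a minimal sketch of how this error can arise, going only by the traceback above (the shapes are made up and this is not the notebook's actual code): backprop.py line 146 passes a noisy copy of the input to the base explainer, and backprop.py line 12 then sets requires_grad on it. If that copy was produced by an op on a tensor that already tracks gradients, it is a non-leaf tensor, and PyTorch refuses to change the flag:

```python
import torch

# Stand-in for the model input; shape is arbitrary for illustration.
x = torch.randn(1, 3, 8, 8, requires_grad=True)

# Adding noise creates a new tensor with a grad_fn, i.e. a non-leaf tensor.
noisy = x + 0.1 * torch.randn_like(x)

# This is effectively what backprop.py line 12 does, and it raises:
# RuntimeError: you can only change requires_grad flags of leaf variables.
noisy.requires_grad = True
```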

skynet1010 commented 4 years ago

Hello, I ran into the same issue by running the example notebook (cnn_pred_explain) from the git repository. Is there a workaround? If I try adjusting the files to use `inp.clone().detach().requires_grad_(True)` I get another error, so that does not seem to work:

RuntimeError: set_sizes_and_strides is not allowed on a Tensor created from .data or .detach(). If your intent is to change the metadata of a Tensor (such as sizes / strides / storage / storage_offset) without autograd tracking the change, remove the .data / .detach() call and wrap the change in a `with torch.no_grad():` block. For example, change: `x.data.set_(y)` to: `with torch.no_grad(): x.set_(y)`
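For what it's worth, here is a sketch of the leaf-copy pattern that avoids the first error in isolation (tensor names and shapes are made up; this is not a patch to backprop.py): detach first, then clone, then use the in-place setter requires_grad_() with the trailing underscore.

```python
import torch

x = torch.randn(1, 3, 8, 8, requires_grad=True)
noisy = x + 0.1 * torch.randn_like(x)   # non-leaf: setting .requires_grad raises

# detach() cuts the graph, clone() makes a fresh leaf tensor that owns its storage,
# and requires_grad_() flips the flag in place (note the trailing underscore).
leaf = noisy.detach().clone()
leaf.requires_grad_(True)
assert leaf.is_leaf and leaf.requires_grad

# Gradients then accumulate on `leaf` (though not back into `x`):
leaf.sum().backward()
print(leaf.grad.shape)   # torch.Size([1, 3, 8, 8])
```

The order may matter for the second error: clone().detach() ends in a tensor that PyTorch still flags as created from .detach(), which is exactly what the set_sizes_and_strides message complains about, whereas detach().clone() ends in a plain copy. Whether this drops cleanly into _backprop depends on what the library does with the tensor afterwards, so treat it as a sketch rather than a fix.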