bayesiains / nflows

Normalizing flows in PyTorch
MIT License

Gradients of log probabilities with respect to the inputs? #31

Closed ufimtsev closed 3 years ago

ufimtsev commented 3 years ago

Dear All,

I wanted to compute gradients of the log probability density w.r.t. the inputs, but apparently the gradients are not propagated to the inputs and `x.grad` is `None`. Would you suggest a fix or workaround? Thanks!

```python
import torch
import torch.nn.functional as F
from nflows import transforms, distributions, flows

def main():
    flow = flows.MaskedAutoregressiveFlow(
        features=1,
        hidden_features=16,
        num_layers=1,
        num_blocks_per_layer=2,
        use_residual_blocks=False,
        use_random_masks=False,
        use_random_permutations=False,
        activation=F.elu,
        dropout_probability=0.0,
        batch_norm_within_layers=False,
        batch_norm_between_layers=False,
    )

    x = torch.tensor([[1.]])
    p = flow.log_prob(x)
    p.backward()
    print(x.grad)

if __name__ == '__main__':
    main()
```

kadeng commented 3 years ago

The tensor has to be constructed with the `requires_grad=True` argument. By default, tensors created with `torch.tensor` have `requires_grad=False`, so autograd does not record operations on them and `x.grad` stays `None`. That should fix it.
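
For concreteness, a minimal sketch of the fix, reusing the `flow` built in the script above:

```python
# Construct the input with requires_grad=True so autograd tracks it as a leaf.
x = torch.tensor([[1.]], requires_grad=True)
p = flow.log_prob(x)
p.backward()
print(x.grad)  # gradient of log p(x) w.r.t. x, no longer None
```

Alternatively, something like `torch.autograd.grad(p.sum(), x)` returns the gradient directly without accumulating it into `x.grad`.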

ufimtsev commented 3 years ago

Right, thank you!