luigibonati / mlcolvar

A unified framework for machine learning collective variables for enhanced sampling simulations
MIT License
91 stars · 24 forks

Dropout does not work #118

Closed — geraseva closed this issue 7 months ago

geraseva commented 7 months ago

When trying to use dropout as follows:

```python
model = DeepTDA(
    n_states=n_states,
    n_cvs=1,
    target_centers=target_centers,
    target_sigmas=target_sigmas,
    layers=nn_layers,
    options={'nn': {'dropout': [0.2, 0.4, 0.6]}},
)
```

it raises an error:

```
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [956547, 4]], which is output 0 of ReluBackward0, is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
```

As I understand it, the cause is the `inplace=True` argument here: https://github.com/luigibonati/mlcolvar/blob/main/mlcolvar/core/nn/feedforward.py#L97.
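A minimal standalone reproduction of the failure mode (an assumption about what happens inside `feedforward.py`, not the actual mlcolvar code): an in-place `Dropout` overwrites the tensor that `ReLU` saved for its backward pass, which bumps the tensor's version counter and makes autograd raise the same `RuntimeError`. Switching to `inplace=False` avoids it.

```python
import torch
import torch.nn as nn

def make_net(dropout_inplace: bool) -> nn.Sequential:
    """Toy MLP mirroring the ReLU -> Dropout pattern from the issue."""
    return nn.Sequential(
        nn.Linear(4, 4),
        nn.ReLU(),
        # inplace=True mutates the ReLU output that autograd saved for backward
        nn.Dropout(p=0.5, inplace=dropout_inplace),
        nn.Linear(4, 1),
    )

x = torch.randn(8, 4)

# Fails: the saved ReLU output is "at version 2; expected version 1"
try:
    make_net(dropout_inplace=True)(x).sum().backward()
except RuntimeError as e:
    print("inplace=True:", type(e).__name__)

# Works: out-of-place dropout leaves the saved tensor untouched
make_net(dropout_inplace=False)(x).sum().backward()
print("inplace=False: backward OK")
```

The cheapest fix would therefore seem to be passing `inplace=False` (or dropping the argument, since `False` is the PyTorch default) where the dropout layer is constructed.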

Could you fix it, please?

geraseva commented 7 months ago

BatchNorm does not work either. I do:

```python
model = DeepTDA(
    n_states=n_states,
    n_cvs=1,
    target_centers=target_centers,
    target_sigmas=target_sigmas,
    layers=nn_layers,
    options={'nn': {'batchnorm': True}},
)
```

and get:

```
ValueError: expected 2D or 3D input (got 1D input)
```
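That `ValueError` is raised by PyTorch's `BatchNorm1d` itself, which requires a batched `(N, C)` or `(N, C, L)` input; my guess (an assumption, since I have not traced the mlcolvar call path) is that somewhere a single unbatched sample reaches the layer. A minimal reproduction:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)

# A single unbatched sample is 1D -> BatchNorm1d rejects it
try:
    bn(torch.randn(4))
except ValueError as e:
    print(e)  # expected 2D or 3D input (got 1D input)

# A (batch, features) tensor is accepted
out = bn(torch.randn(8, 4))
print(out.shape)
```

If that is indeed the cause, keeping a leading batch dimension on the tensor (e.g. `x.unsqueeze(0)` for a lone sample) makes the layer happy, though batch statistics over a single sample are of course degenerate in training mode.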