automl / NASLib

NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.
Apache License 2.0

Provide an option to disable in-place ReLU so that PyTorch 'register_full_backward_hook' can be used #113

Closed lichuanx closed 2 years ago

lichuanx commented 2 years ago

Hi, dear organizers: I have come across some issues while coding my submission. It seems there is no option to re-define the network, which means the in-place ReLU breaks 'register_full_backward_hook', raising errors like:

RuntimeError: Output 0 of BackwardHookFunctionBackward is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can remove this warning by cloning the output of the custom Function.
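The failure mode described above can be reproduced outside NASLib with a minimal sketch (the model and helper names here are hypothetical, not part of NASLib's API): a full backward hook wraps a module's output in a custom autograd Function, and a subsequent in-place ReLU then modifies that wrapped output, which PyTorch forbids. Setting `inplace=False` avoids the conflict, which is the option this issue requests.

```python
import torch
import torch.nn as nn


def make_model(inplace: bool) -> nn.Sequential:
    # Hypothetical two-layer model; the ReLU's `inplace` flag is the knob
    # this issue asks NASLib to expose.
    return nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=inplace))


def try_backward_hook(inplace: bool) -> str:
    model = make_model(inplace)
    # Hook the Linear layer: its output is exactly the tensor an
    # in-place ReLU would overwrite, triggering the view+inplace check.
    model[0].register_full_backward_hook(lambda mod, gin, gout: None)
    x = torch.randn(2, 4, requires_grad=True)
    try:
        model(x).sum().backward()
        return "ok"
    except RuntimeError:
        return "RuntimeError"


# With inplace=True the run is expected to hit the RuntimeError quoted
# above; with inplace=False the hook fires normally.
print(try_backward_hook(inplace=True))
print(try_backward_hook(inplace=False))


def disable_inplace_relu(module: nn.Module) -> None:
    """Workaround sketch: flip the `inplace` flag on every ReLU in an
    already-built network, for cases where the constructor offers no option."""
    for child in module.modules():
        if isinstance(child, nn.ReLU):
            child.inplace = False
```

`disable_inplace_relu` is one possible stop-gap a user could apply to an existing NASLib-built network, but a constructor-level option as requested here would be cleaner, since it avoids mutating modules after construction.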