facebookresearch / kill-the-bits

Code for: "And the bit goes down: Revisiting the quantization of neural networks"

Why is the activation the input? #35

Closed yw155 closed 4 years ago

yw155 commented 4 years ago

Hi @pierrestock, I would like to ask you a question about the following hook:

def _register_hooks(self):
    # define a hook that saves the input activations of each layer
    def fwd_hook(module, input, output):
        layer = self.modules_to_layers[module]
        if self._watch:
            # retrieve the input activations of the layer
            activations = input[0].data.cpu()
            # store activations
            self.activations[layer].append(activations)

Why is the activation taken from `input` rather than `output` here? Thanks in advance.

pierrestock commented 4 years ago

Hi yw155,

Thanks for your interest. This is the way forward hooks are implemented in PyTorch: the hook receives the module, its input, and its output, so either one can be stored. See the documentation here.
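
For illustration, here is a minimal standalone sketch (an assumed example, not code from this repo) showing that a forward hook receives both the input and the output of a layer:

import torch
import torch.nn as nn

# A forward hook is called as hook(module, input, output): `input` is the
# tuple of positional arguments passed to forward(), `output` is its result.
def fwd_hook(module, input, output):
    print("input shape: ", tuple(input[0].shape))   # what goes INTO the layer
    print("output shape:", tuple(output.shape))     # what comes OUT of the layer

layer = nn.Linear(8, 4)
handle = layer.register_forward_hook(fwd_hook)

x = torch.randn(2, 8)
_ = layer(x)     # triggers the hook: prints (2, 8) then (2, 4)

handle.remove()  # detach the hook once done

In our case we deliberately keep `input[0]`: the quantization method works on the activations fed into each layer, since they are what weights the reconstruction error of that layer's output.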

Hope this helps,

Pierre