Xilinx / brevitas

Brevitas: neural network quantization in PyTorch
https://xilinx.github.io/brevitas/

Modify layers in a pretrained Brevitas Model #1008

Open abedbaltaji opened 2 weeks ago

abedbaltaji commented 2 weeks ago

How to modify or add layers to a pretrained Brevitas model?

I am working with the CNV pretrained model from Brevitas to use it with FINN. My goal is to modify a layer or add a custom one. For example, I want to replace the QuantReLU with my custom MyQuantReLU or substitute a QuantMaxPool with a layer that calculates a minpool instead.

Could you advise on the best approach to achieve this?

Looking forward to your support.

nickfraser commented 2 weeks ago

There are a few ways to do this:

  1. (easiest): Rewrite the CNV model to use your custom layers - if the state_dicts are compatible, you can simply load the state_dict from the pretrained version into your custom version. If they're not compatible, you might need to override _load_from_state_dict in your custom module.
  2. (hard): Use our module-to-module modification infrastructure to convert the CNV modules of interest into the modules you want. This can be tricky because our tools try to reconstruct the kwargs used to instantiate the original model from its attributes. This can quickly turn into quite the rabbit hole - I wouldn't recommend it for a small model like CNV.
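Option 1 can be sketched roughly as follows, using plain PyTorch modules as stand-ins for the Brevitas ones (the TinyCNV/TinyCNVCustom classes and MyQuantReLU here are hypothetical placeholders, not the actual CNV definition). Because a parameter-free activation contributes nothing to the state_dict, swapping it leaves the state_dicts compatible and load_state_dict just works:

```python
import torch
import torch.nn as nn

class MyQuantReLU(nn.ReLU):
    # Hypothetical custom activation standing in for a custom QuantReLU.
    # It has no parameters of its own, so the state_dict is unchanged.
    pass

class TinyCNV(nn.Module):
    # Stand-in for the pretrained model definition.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.act = nn.ReLU()

class TinyCNVCustom(nn.Module):
    # Same topology and submodule names, custom activation swapped in.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.act = MyQuantReLU()

pretrained = TinyCNV()  # in practice: the pretrained Brevitas CNV
custom = TinyCNVCustom()

# Keys match (conv.weight, conv.bias), so a strict load succeeds.
custom.load_state_dict(pretrained.state_dict(), strict=True)
```

If your custom layer did add or rename parameters, the keys would no longer line up, and that is when you would need the custom loading logic mentioned above.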

If you choose to do option 2, we will only provide limited support as this is for advanced users.
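For a sense of what module-to-module rewriting does under the hood, here is a minimal hand-rolled version in plain PyTorch (this is a sketch of the general idea, not Brevitas's actual rewriting infrastructure, which additionally tries to recover constructor kwargs from module attributes). It also shows the minpool substitution from the question, built as negated maxpool:

```python
import torch
import torch.nn as nn

def replace_modules(model, old_cls, make_new):
    # Recursively walk the module tree and swap every instance of
    # old_cls for whatever make_new(old_module) returns.
    for name, child in model.named_children():
        if isinstance(child, old_cls):
            setattr(model, name, make_new(child))
        else:
            replace_modules(child, old_cls, make_new)
    return model

class MinPool2d(nn.Module):
    # min(x) == -max(-x), so minpool can reuse MaxPool2d.
    def __init__(self, kernel_size):
        super().__init__()
        self.pool = nn.MaxPool2d(kernel_size)

    def forward(self, x):
        return -self.pool(-x)

# Toy model standing in for CNV; replace each MaxPool2d with a MinPool2d
# of the same kernel size.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.MaxPool2d(2))
replace_modules(model, nn.MaxPool2d, lambda old: MinPool2d(old.kernel_size))
```

One caveat: a swap like this happens outside Brevitas's quantization bookkeeping, so you would still need to check that the resulting graph exports cleanly to FINN.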

Note, we do not have a QuantMaxPool or QuantReLU in CNV - are you sure you're looking at the right network?

Out of curiosity, is there something that you need in your custom QuantReLU that isn't provided in the regular one? If so, we may consider adding it - please let us know!