iamhankai / ghostnet.pytorch

[CVPR2020] GhostNet: More Features from Cheap Operations
https://arxiv.org/abs/1911.11907

Custom Weight Initialization #24

Open glenn-jocher opened 4 years ago

glenn-jocher commented 4 years ago

I noticed you use code for custom weight initialization: https://github.com/iamhankai/ghostnet.pytorch/blob/2c90e67d8c33c44ec1bad12c9686f645b0d4de08/ghost_net.py#L162-L169

I've not seen this before. Is there a reason behind this specific strategy? Do you know the effect this has on the training, and have you compared this with the pytorch default weight initialization? Thank you!

iamhankai commented 4 years ago

kaiming_normal_ is a commonly used initialization strategy.
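
For reference, a minimal sketch of this style of init loop (the module layout and parameter choices here mirror common Kaiming usage, not necessarily the exact code linked above): Conv2d weights are drawn from N(0, sqrt(2 / fan_out)), which is sized so that ReLU activations keep roughly unit variance from layer to layer, and BatchNorm is set to the identity transform.

```python
import math
import torch
import torch.nn as nn

def init_weights(model: nn.Module) -> None:
    """Kaiming (He) normal init for conv layers, identity init for BN."""
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            # weight ~ N(0, sqrt(2 / fan_out)); gain 2 accounts for ReLU
            nn.init.kaiming_normal_(m.weight, mode='fan_out',
                                    nonlinearity='relu')
            if m.bias is not None:
                nn.init.zeros_(m.bias)
        elif isinstance(m, nn.BatchNorm2d):
            nn.init.ones_(m.weight)   # scale gamma = 1
            nn.init.zeros_(m.bias)    # shift beta = 0

conv = nn.Conv2d(3, 16, kernel_size=3, bias=False)
init_weights(conv)
# fan_out = out_channels * kH * kW = 16 * 3 * 3 = 144,
# so the target std is sqrt(2 / 144)
print(round(math.sqrt(2 / 144), 3))  # → 0.118
```

With `mode='fan_out'` the variance is preserved through the backward pass rather than the forward one; both variants appear in practice.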

glenn-jocher commented 4 years ago

@iamhankai thank you! Do you know what the default pytorch weights init strategy is?

I suppose using the same strategy in both makes for easier comparisons with the TF version of GhostNet?

iamhankai commented 4 years ago

@glenn-jocher The TF version of GhostNet also uses Kaiming normal initialization.
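
To round out the question about PyTorch's defaults: in recent PyTorch versions, `nn.Conv2d` and `nn.Linear` initialize themselves in `reset_parameters` with `kaiming_uniform_(weight, a=math.sqrt(5))`, which works out to weights drawn from U(-b, b) with b = 1 / sqrt(fan_in). A small check (layer sizes here are arbitrary):

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(3, 16, kernel_size=3)

# For kaiming_uniform_ with a=sqrt(5):
#   gain  = sqrt(2 / (1 + a^2)) = sqrt(1/3)
#   bound = sqrt(3) * gain / sqrt(fan_in) = 1 / sqrt(fan_in)
fan_in = 3 * 3 * 3  # in_channels * kH * kW = 27
bound = 1 / math.sqrt(fan_in)

# every default-initialized weight falls inside (-bound, bound)
assert conv.weight.abs().max().item() <= bound
print(round(bound, 4))  # → 0.1925
```

So the default is also a Kaiming scheme, just uniform rather than normal and with a different gain, which is one reason switching to an explicit `kaiming_normal_` call usually has only a modest effect on training.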