-
In USConv2d:
self.width_mult = None
Then, in the forward function:
self.in_channels = make_divisible(self.in_channels_max * self.width_mult / self.ratio[0]) * self.ratio[0]
This evaluates int * None …
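For context, here is a minimal runnable sketch of that computation. The `make_divisible` helper follows the common definition used with MobileNet-style models, and the function names are simplified stand-ins for the snippet above, not the repo's exact code:

```python
def make_divisible(v, divisor=8, min_value=None):
    """Round v to the nearest multiple of divisor, never dropping below 90% of v."""
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

def scaled_in_channels(in_channels_max, width_mult, ratio=1):
    """Mirrors the forward() line above; fails exactly as reported
    when width_mult was left as None."""
    if width_mult is None:
        raise ValueError("set width_mult before calling forward()")
    return make_divisible(in_channels_max * width_mult / ratio) * ratio

print(scaled_in_channels(64, 0.5))  # 32
```

So the error simply means `width_mult` was never assigned between construction and the first forward pass.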
-
In us.mobilenet_v2.py:

```python
self.block_setting = [
    # t, c, n, s
    [1, 16, 1, 1],
    [6, 24, 2, 2],
    [6, 32, 3, 2],
    [6, 64, 4, 2],
    [6,…
```
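For readers unfamiliar with this table: each row is (t, c, n, s), i.e. expand ratio, output channels, number of repeats, and stride. A hedged sketch of the usual expansion loop follows; the helper name and the input channel count are illustrative assumptions, not the repo's exact code:

```python
block_setting = [
    # t (expand ratio), c (output channels), n (repeats), s (stride)
    [1, 16, 1, 1],
    [6, 24, 2, 2],
]

def expand_blocks(block_setting, input_channel=32):
    """Turn (t, c, n, s) rows into per-layer configs; the stride
    applies only to the first repeat of each row."""
    layers = []
    for t, c, n, s in block_setting:
        for i in range(n):
            layers.append((input_channel, c, t, s if i == 0 else 1))
            input_channel = c
    return layers

print(expand_blocks(block_setting))
# [(32, 16, 1, 1), (16, 24, 6, 2), (24, 24, 6, 1)]
```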
-
Hi, thank you for your great work. :)
I have a question about arbitrary widths in a universally slimmable network.
In your paper, you mention that you sample a random width ratio for each sub-netw…
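For reference, the sampling the question refers to can be sketched as below. The bounds 0.25 and 1.0 are the commonly used range for US-Nets, but here they are illustrative assumptions:

```python
import random

def sample_width(width_min=0.25, width_max=1.0):
    """Draw one width ratio uniformly from a continuous range,
    rather than picking from a fixed discrete list."""
    return random.uniform(width_min, width_max)

random.seed(0)
w = sample_width()
print(0.25 <= w <= 1.0)  # True
```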
-
```python
if FLAGS.optimizer == 'sgd':
    # all depthwise convolution (N, 1, x, x) has no weight decay
    # weight decay only on normal conv and fc
    model_params = []
    for name,…
```
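The grouping logic that snippet implements can be sketched without a framework. Here parameters are mocked as (name, shape) pairs, and the depthwise test (a 4-D weight whose second dimension is 1) follows the comment above; the parameter names are made up for illustration:

```python
named_params = [
    ("conv1.weight", (32, 3, 3, 3)),    # normal conv  -> weight decay
    ("conv2_dw.weight", (32, 1, 3, 3)), # depthwise (N, 1, x, x) -> no decay
    ("fc.weight", (1000, 1280)),        # fc           -> weight decay
    ("bn1.weight", (32,)),              # bn/bias      -> no decay
]

decay, no_decay = [], []
for name, shape in named_params:
    is_depthwise = len(shape) == 4 and shape[1] == 1
    if not is_depthwise and len(shape) in (2, 4):
        decay.append(name)     # normal conv and fc
    else:
        no_decay.append(name)  # depthwise conv, bn, bias

print(decay)     # ['conv1.weight', 'fc.weight']
print(no_decay)  # ['conv2_dw.weight', 'bn1.weight']
```

The two lists would then become two optimizer parameter groups, one with weight decay and one with it set to zero.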
-
Hi,
Thank you for releasing the code. It is very useful.
I'm trying to reproduce the performance of USNet.
As far as I know, the USNet paper adopts n=4 for training, which means num_sample_training is 4.
But, in…
bhheo updated 4 years ago
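A toy sketch of the n=4 step being asked about: under the sandwich rule, the smallest and largest widths are always included, the other n-2 are random, and all n backward passes accumulate into a single optimizer step. The numbers and the toy gradient here are illustrative, not from the repo:

```python
import random

def sandwich_widths(n=4, width_min=0.25, width_max=1.0, seed=0):
    """num_sample_training widths per step: max, min, plus n - 2 random."""
    rng = random.Random(seed)
    return [width_max, width_min] + [
        rng.uniform(width_min, width_max) for _ in range(n - 2)
    ]

def train_step(param, lr=0.1, grad_at=lambda p, w: p * w):
    """Toy gradient accumulation: one backward per width, one update total."""
    total_grad = sum(grad_at(param, w) for w in sandwich_widths())
    return param - lr * total_grad

print(len(sandwich_widths()))  # 4
```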
-
Exciting! I have studied your papers seriously. I am very interested in your work and am looking forward to your new code for the "AutoSlim..." paper.
I hope you can release it soon.
-
Hello, I really like this work on network pruning.
I have a small question about the BN parameters during searching.
Take MobileNetV1 as an example: while training the PruningNet, the BN's running…
-
First of all, thank you for your wonderful work!
I have a question regarding the `USBatchNorm` operation.
I thought that sharing batchnorm between different channel widths was the main reason fo…
-
Hi Jiahui, thanks for open-sourcing your work. I have some questions about some details in universally slimmable networks.
1. I didn't quite understand the post-statistics BN. Could you please e…
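As background for this question: after training, BN running statistics for a chosen width are typically re-estimated by forwarding a number of batches at that width. One hedged way to implement the averaging is a cumulative moving average over per-batch statistics (in torch terms, momentum=None); this sketch is an assumption about the procedure, not the repo's exact code:

```python
def post_bn_mean(batch_means):
    """Cumulative moving average of per-batch means, as used when
    re-calibrating BN statistics for one sampled width."""
    mean, count = 0.0, 0
    for bm in batch_means:
        count += 1
        mean += (bm - mean) / count
    return mean

print(post_bn_mean([1.0, 2.0, 3.0]))  # 2.0
```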
-
Hi Jiahui, I was trying to reproduce the USNet, but I encountered some issues in computing post BN statistics. After training (actually, I use your released model), I first randomly assign a width, then …