wm901115nwpu opened 4 years ago
I'm using `alexnet_bn_wrpn` on the ImageNet dataset, but the result at epoch 36 is lower than with DoReFa, PACT, or FP32 training. This is my config file:

```yaml
quantizers:
  wrpn_quantizer:
    class: WRPNQuantizer
    bits_activations: 8
    bits_weights: 4
    overrides:
      # Don't quantize the first and last layers
      features.0:
        bits_weights: null
        bits_activations: null
      features.1:
        bits_weights: null
        bits_activations: null
      classifier.5:
        bits_weights: null
        bits_activations: null
      classifier.6:
        bits_weights: null
        bits_activations: null

lr_schedulers:
  training_lr:
    class: MultiStepLR
    milestones: [60, 75]
    gamma: 0.2

policies:
```
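The `policies` section above is empty. In Distiller's example schedules, this section is what actually activates the quantizer and LR scheduler during training; without it, the quantizer defined under `quantizers` is never applied. A sketch of what it might look like (the epoch range and frequency below are assumptions, not values from this issue):

```yaml
policies:
  - quantizer:
      instance_name: wrpn_quantizer
    starting_epoch: 0
    ending_epoch: 200
    frequency: 1

  - lr_scheduler:
      instance_name: training_lr
    starting_epoch: 0
    ending_epoch: 200
    frequency: 1
```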
Also, I can't find any parameter for widening the layers (the width multiplier described in the WRPN paper). Could you tell me how to configure that?
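For context on the widening question: in WRPN, widening is a property of the model architecture (more filters per layer), not of the quantizer config, so it would normally be baked into the model definition rather than set in the YAML schedule. A minimal PyTorch sketch of the idea, assuming a simple AlexNet-BN-style front end (the layer shapes and the `widen_features` helper are illustrative, not Distiller's actual `alexnet_bn_wrpn` code):

```python
import torch
import torch.nn as nn

def widen_features(width_mult=2.0):
    """Build the first feature-extraction layers of an AlexNet-BN-style
    network with intermediate channel counts scaled by `width_mult`.
    Per the WRPN recipe, the network input (3 RGB channels) is not widened."""
    def c(ch):
        # Scale a channel count by the width multiplier, keeping an integer.
        return int(ch * width_mult)

    return nn.Sequential(
        # First conv: input channels stay at 3, output channels are widened.
        nn.Conv2d(3, c(64), kernel_size=11, stride=4, padding=2),
        nn.BatchNorm2d(c(64)),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=3, stride=2),
        # Subsequent convs are widened on both input and output channels.
        nn.Conv2d(c(64), c(192), kernel_size=5, padding=2),
        nn.BatchNorm2d(c(192)),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=3, stride=2),
    )

features = widen_features(width_mult=2.0)
x = torch.randn(1, 3, 224, 224)
out = features(x)
print(out.shape)  # channels doubled: 192 -> 384
```

With `width_mult=2.0` every internal layer has twice the filters, which is what lets WRPN recover accuracy at low bit widths; the quantizer YAML then only controls how those (wider) layers are quantized.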