Open tkng opened 6 years ago
FYI, `PerImageStandardization` may also be necessary for the MNIST dataset.

About the MNIST dataset: when I used `DivideBy255`, the accuracy on the test data was very unstable. But when I used `PerImageStandardization`, the result on the test data was very good (stable, high accuracy).
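To make the difference concrete, here is a minimal NumPy sketch of what the two preprocessing methods compute. This is not Blueoil's actual implementation, just an illustration: `per_image_standardization` below follows the same idea as `tf.image.per_image_standardization` (zero mean, unit variance per image, with a lower bound on the stddev), while `divide_by_255` simply rescales to [0, 1].

```python
import numpy as np

def divide_by_255(image):
    """DivideBy255-style preprocessing: scale pixel values to [0, 1]."""
    return image.astype(np.float32) / 255.0

def per_image_standardization(image):
    """PerImageStandardization-style preprocessing: zero-mean,
    unit-variance scaling per image (analogous to
    tf.image.per_image_standardization)."""
    image = image.astype(np.float32)
    mean = image.mean()
    # Lower-bound the stddev to avoid division by zero on constant
    # images, as tf.image.per_image_standardization does.
    adjusted_std = max(image.std(), 1.0 / np.sqrt(image.size))
    return (image - mean) / adjusted_std

# Example: a fake 28x28 grayscale image, like an MNIST sample.
img = np.random.randint(0, 256, size=(28, 28, 1), dtype=np.uint8)
std_img = per_image_standardization(img)
print(std_img.mean(), std_img.std())  # mean close to 0, std close to 1
```

Because the standardized input is centered and scaled per image, it is less sensitive to global brightness/contrast differences than a plain divide-by-255, which may explain the more stable test accuracy.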
Let me summarize what we have to do in `./blueoil.sh init`:

* change the preprocessor method from `DivideBy255` to `PerImageStandardization`, and
* do not apply activation quantization in the first layer.

@lm-jira Thanks. There are some small mistakes regarding the quantization part:
* Allow users to choose, in the init command (`./blueoil.sh init`), whether or not to apply activation quantization at the first layer of training.
* Users should choose whether to apply quantization at the first layer for both activations and weights, not only activation quantization.
* If they do not want to apply it -> change the preprocessor method from `DivideBy255` to `PerImageStandardization` and do not apply activation quantization in the first layer.
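The coupling described above (the user's first-layer quantization choice also determines the preprocessor) could look roughly like the following. This is a hypothetical sketch, not Blueoil's actual init code or config schema; the key names and the `"linear_mid_tread_half"` quantizer label are illustrative assumptions.

```python
def make_config(quantize_first_layer: bool) -> dict:
    """Hypothetical sketch of an init-time choice that ties the
    preprocessor to first-layer activation quantization."""
    if quantize_first_layer:
        # Quantized first layer: keep the simple [0, 1] scaling.
        return {
            "preprocessor": "DivideBy255",
            "first_layer": {"activation_quantizer": "linear_mid_tread_half"},
        }
    # Unquantized first layer: switch to per-image standardization.
    return {
        "preprocessor": "PerImageStandardization",
        "first_layer": {"activation_quantizer": None},
    }

print(make_config(False)["preprocessor"])  # PerImageStandardization
print(make_config(True)["preprocessor"])   # DivideBy255
```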
`PerImageStandardization` is still not possible to select in the `blueoil init` command, so I think it's better to keep this issue open for now.
Currently, we support only the `DivideBy255` preprocessor. However, internally, we have already prepared `PerImageStandardization`. @yasumura-lm san mentioned that this preprocessing method is sometimes very effective. Hence, we want to support `PerImageStandardization`.