blue-oil / blueoil

Bring Deep Learning to small devices
https://blueoil.org
Apache License 2.0

add new preprocessing: PerImageStandardization #6

Open tkng opened 6 years ago

tkng commented 6 years ago

Currently, we support only the DivideBy255 preprocessor. However, internally we have already prepared PerImageStandardization. @yasumura-lm san mentioned that this preprocessing method is sometimes very effective.

Hence, we want to support PerImageStandardization.
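For reference, a minimal sketch of what per-image standardization computes, following the semantics of `tf.image.per_image_standardization` (the actual blueoil implementation may differ in details):

```python
import numpy as np

def per_image_standardization(image: np.ndarray) -> np.ndarray:
    """Scale a single image to zero mean and (approximately) unit variance.

    Mirrors tf.image.per_image_standardization: the divisor is
    max(stddev, 1/sqrt(num_elements)) to guard against division by
    zero on uniform images.
    """
    image = image.astype(np.float32)
    mean = image.mean()
    adjusted_stddev = max(float(image.std()), 1.0 / np.sqrt(image.size))
    return (image - mean) / adjusted_stddev
```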

tk26eng commented 6 years ago

FYI, PerImageStandardization might also be necessary for the MNIST dataset.

About MNIST: when I used DivideBy255, the accuracy on the test data was very unstable. But when I used PerImageStandardization, the test results were good (stable and high accuracy).
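One plausible explanation is that DivideBy255 preserves per-image brightness differences, while standardization maps every image to zero mean and unit variance. A toy comparison (synthetic data, not actual MNIST):

```python
import numpy as np

def divide_by_255(image: np.ndarray) -> np.ndarray:
    return image.astype(np.float32) / 255.0

def standardize(image: np.ndarray) -> np.ndarray:
    image = image.astype(np.float32)
    return (image - image.mean()) / max(float(image.std()), 1.0 / np.sqrt(image.size))

rng = np.random.default_rng(0)
digit = rng.uniform(0.0, 1.0, size=(28, 28))
bright = (digit * 255).astype(np.float32)  # well-exposed digit
dark = (digit * 80).astype(np.float32)     # same digit, dim scan

for name, img in [("bright", bright), ("dark", dark)]:
    print(name,
          "DivideBy255 mean=%.3f" % divide_by_255(img).mean(),
          "standardized mean=%.3f std=%.3f"
          % (standardize(img).mean(), standardize(img).std()))
```

After DivideBy255 the two inputs still differ in scale, whereas after standardization both have mean 0 and std 1, which can make training less sensitive to exposure variation.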

lm-jira commented 6 years ago

Let me summarize what we have to do.

ruimashita commented 6 years ago

@lm-jira Thanks. There are a few small mistakes regarding quantization.

* In the init command (`./blueoil.sh init`), allow users to choose whether or not to apply quantization at the first layer of training.

Users should choose whether to apply quantization of both weights and activations at the first layer, not only activation quantization.

If they do not want to apply it: change the preprocessor from DivideBy255 to PerImageStandardization, and do not apply activation quantization in the first layer (see the config sketch below).
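A rough sketch of how a generated config could couple the two settings. The module paths (`blueoil.data_processor`, `blueoil.pre_processor`) and the `QUANTIZE_FIRST_LAYER` flag are assumptions for illustration, not the actual init output:

```python
# Hypothetical excerpt of a config produced by `./blueoil.sh init`.
from blueoil.data_processor import Sequence
from blueoil.pre_processor import DivideBy255, PerImageStandardization, Resize

QUANTIZE_FIRST_LAYER = False  # assumed flag: the answer collected at init time

PRE_PROCESSOR = Sequence([
    Resize(size=(32, 32)),
    # A quantized first layer expects inputs in [0, 1], so keep DivideBy255;
    # an unquantized first layer can take zero-mean standardized inputs.
    DivideBy255() if QUANTIZE_FIRST_LAYER else PerImageStandardization(),
])
```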

tkng commented 4 years ago

It is still not possible to select PerImageStandardization through the blueoil init command. I think it's better to keep this issue open for now.