soumith / inception.torch

Torch port of https://github.com/google/inception

Question on normalization in example.lua #4

Open · felixsmueller opened 8 years ago

felixsmueller commented 8 years ago

Hi,

Sorry for bothering you. In example.lua I saw the following: `img:mul(255):clamp(0, 255):add(-117)`. The `mul(255)` scales the values up from the range 0...1 to 0...255. The `add(-117)` subtracts the mean over all the ImageNet images, I suppose. I noticed that you do not divide by the standard deviation. Is this just a simplification for this example, or is it not needed in general? If we should do that normalization, what value do you suggest (the standard deviation over all ImageNet images)?
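For reference, the Torch chain from example.lua can be sketched in NumPy like this (the function name and the assumption that the input is a float image in [0, 1] are mine, not from the repo; 255 and 117 are the constants from the snippet above):

```python
import numpy as np

def preprocess(img):
    """NumPy sketch of the Torch chain img:mul(255):clamp(0, 255):add(-117).

    Assumes `img` is a float array with values in [0, 1], as produced by a
    typical image loader.
    """
    x = img * 255.0             # scale from [0, 1] up to [0, 255]
    x = np.clip(x, 0.0, 255.0)  # guard against values slightly out of range
    x = x - 117.0               # subtract the mean pixel value (from the snippet)
    return x
```

Note there is no division by a standard deviation anywhere in this chain, which is exactly what the question is about.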

Regards, Felix

soumith commented 8 years ago

@felixsmueller I do not divide by the standard deviation because that is what the Google network seemed to do in training. Usually, for all the networks I train myself, I normalize to zero mean and unit standard deviation.
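The zero-mean, unit-std scheme mentioned above looks roughly like this; the function name is mine, and the `mean`/`std` arguments are placeholders whose actual values would come from the statistics of the training data:

```python
import numpy as np

def normalize_mean_std(img, mean, std):
    """Normalize an image to zero mean and unit standard deviation.

    `mean` and `std` are per-dataset (or per-channel) statistics computed
    over the training images; they are parameters here, not fixed constants.
    """
    return (img - mean) / std
```

The key point of the answer is that which scheme to use at inference time is dictated by how the network was trained: this Google network was trained with only the mean subtracted, so dividing by a standard deviation here would be wrong.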

felixsmueller commented 8 years ago

Thanks a lot.