carpedm20 / simulated-unsupervised-tensorflow

TensorFlow implementation of "Learning from Simulated and Unsupervised Images through Adversarial Training"
Apache License 2.0

Curious about the constant used in the normalize function #22

Open daeilkim opened 7 years ago

daeilkim commented 7 years ago

In layers.py, there is a normalize function that has a constant of 127.5:

def normalize(layer): return layer/127.5 - 1.

I'm a little confused as to where the 127.5 comes from. It's a very specific question, of course, but I'm interested in extending the regularization loss function with other types of transforms beyond the identity mapping used in the paper. If you have any tips or pointers on modifying it, I'd love to hear them. Great work, and thanks for doing this!

alex-mocanu commented 7 years ago

It looks like the purpose of this normalization is to bring the values into the range [-1.0, 1.0]. Since grayscale pixel values lie in [0, 255], dividing by 127.5 maps 255 to 2.0, and subtracting 1 then shifts the range to [-1.0, 1.0].
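To make the mapping concrete, here is a small self-contained sketch of that scaling, together with a `denormalize` inverse (the inverse is my addition, not something quoted from `layers.py`):

```python
import numpy as np

# The scaling discussed above: map 8-bit pixel values [0, 255]
# to [-1.0, 1.0]. Note that 127.5 is simply 255 / 2.
def normalize(layer):
    return layer / 127.5 - 1.

# Hypothetical inverse, handy if you want to write refined images back
# out as 8-bit data. Not part of the original repo.
def denormalize(layer):
    return (layer + 1.) * 127.5

pixels = np.array([0., 127.5, 255.])
print(normalize(pixels))                # [-1.  0.  1.]
print(denormalize(normalize(pixels)))   # [  0.  127.5 255. ]
```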

daeilkim commented 7 years ago

So in the code's current form, would it not support RGB images?

alex-mocanu commented 7 years ago

Looking over the code, it appears to support images with any number of channels (limited only by GPU memory). You have to set the "input_channel" command-line parameter to 3, and you should change the output layer of the refiner to output "self.input_channel" channels instead of 1.
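The reason the refiner's last layer has to match "input_channel" is that the self-regularization loss compares the refined and synthetic images element-wise, so their shapes must agree. A minimal NumPy sketch of the shape constraint (hypothetical code, not the repo's actual refiner; the 1x1 convolution is modeled as a matrix product over the channel axis):

```python
import numpy as np

def final_conv(features, input_channel, seed=0):
    # Toy stand-in for the refiner's output layer: a 1x1 convolution
    # implemented as a matrix product over the channel dimension.
    rng = np.random.default_rng(seed)
    in_ch = features.shape[-1]
    weights = rng.standard_normal((in_ch, input_channel)) * 0.01
    return features @ weights  # shape: (H, W, input_channel)

features = np.zeros((35, 55, 64))            # intermediate refiner features
refined = final_conv(features, input_channel=3)
print(refined.shape)                          # (35, 55, 3)

# The per-pixel regularization loss only makes sense when the refined
# output and the synthetic input have identical shapes:
synthetic_rgb = np.zeros((35, 55, 3))
loss = np.abs(refined - synthetic_rgb).sum()  # would fail for mismatched channels
```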