Closed: azshue closed this issue 5 years ago
Not all (a,b) values are in-gamut. In the original Caffe implementation, I kept only the 313 bins that are in-gamut. For the reimplementation, I just kept all 23*23 = 529 bins, which makes the encoding code a bit more straightforward. In practice, the other 529-313 = 216 bins are essentially unused, so at test time they should output ~0% probability.
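For anyone else landing here, a minimal sketch of where 529 comes from (the helper names and the exact grid bounds are assumptions for illustration, not the repo's actual encoding code): the ab plane is binned on a 23x23 grid with bin width 10, assumed to span roughly [-110, 110], giving 23*23 = 529 classes, of which only ~313 bin centers are reachable in sRGB.

```python
# Sketch only: assumed grid parameters and hypothetical helper names.
AB_RANGE = 110   # assumed half-width of the ab grid
BIN_SIZE = 10    # assumed bin width
N_PER_AXIS = 2 * AB_RANGE // BIN_SIZE + 1   # 23 values per axis

def ab_to_bin(a, b):
    """Map a single (a, b) value to a class index in [0, 529)."""
    a_idx = int(round((a + AB_RANGE) / BIN_SIZE))   # 0..22
    b_idx = int(round((b + AB_RANGE) / BIN_SIZE))   # 0..22
    return a_idx * N_PER_AXIS + b_idx

def bin_to_ab(idx):
    """Recover the (a, b) bin center from a class index."""
    a_idx, b_idx = divmod(idx, N_PER_AXIS)
    return a_idx * BIN_SIZE - AB_RANGE, b_idx * BIN_SIZE - AB_RANGE

print(ab_to_bin(0, 0))   # 264: neutral gray falls in the middle of the grid
print(bin_to_ab(264))    # (0, 0)
```

The out-of-gamut bins simply never receive mass in the soft-encoded targets, which is why the network learns to assign them ~0% probability.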
Thank you so much! This also explains the encoding code in the PyTorch implementation.
Hi,
I'm reading the implementation details of the SIGGRAPHGenerator, but I'm having some trouble understanding the classification output of this network.
Classification output in SIGGRAPHGenerator: https://github.com/richzhang/colorization-pytorch/blob/9fd9bd867bca53c861816298089c978617d7d5f5/models/networks.py#L315
Is 529 the number of quantized colors Q? I would have expected this number to be 313 instead... Could anyone please explain this number?
Thanks,