Open y-h-Lin opened 10 months ago
Hi @y-h-Lin,
> However, the final images are all white. I was wondering if you have any recommended approaches for handling this?
I think the following steps would be a good starting point for investigating this issue:

1. Check whether `uvcgan2` loads images correctly. If you translate images with `translate_images.py`, it will save the untranslated input images in the `real_a` and `real_b` directories, next to `fake_a` and `fake_b`. It may be worth looking over the `real_a` and `real_b` images to make sure they are correct (not all white). If they are not, then there is probably a bug somewhere in the image-loader code, and we will need to fix that.
2. If `uvcgan2` loads images correctly but the translations look all white, it is possible that the training has diverged. We can look at the training losses to check whether that happened. Could you please post the last line of the `history.csv` file (it should be saved in the model directory) here? It will tell us whether the training has diverged.
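The two checks above could be sketched roughly as follows. This is an illustrative script, not part of `uvcgan2`: `outdir` and `model_dir` are placeholder paths, and the `*.png` extension is an assumption about the saved outputs.

```python
# Illustrative diagnostic, assuming the directory layout described above.
import csv
from pathlib import Path

import numpy as np
from PIL import Image

def is_all_white(path, threshold=250):
    """Return True if every pixel in the image is (near) white."""
    arr = np.asarray(Image.open(path).convert("L"))
    return bool((arr >= threshold).all())

# Check 1: count all-white images in each output directory.
for subdir in ("real_a", "real_b", "fake_a", "fake_b"):
    images = sorted(Path("outdir", subdir).glob("*.png"))  # placeholder path
    n_white = sum(is_all_white(p) for p in images)
    print(f"{subdir}: {n_white}/{len(images)} all-white images")

# Check 2: print the last line of history.csv to inspect the final losses.
history = Path("model_dir", "history.csv")  # placeholder path
if history.exists():
    with open(history) as f:
        print(list(csv.reader(f))[-1])
```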
Thank you for your reply.
Regarding point 1, I believe the image loader is functioning properly. The `real_a` and `real_b` directories contain normal images; however, the other four directories (`fake_a`, `fake_b`, `reco_a`, `reco_b`) contain all-white images.
Below is the final line from my `history.csv`:

| gen_ab | gen_ba | cycle_a | cycle_b | disc_a | disc_b | idt_a | idt_b | gp_a | gp_b |
| -- | -- | -- | -- | -- | -- | -- | -- | -- | -- |
| 0.998819 | 0.998798 | 1.414826 | 1.512557 | 0.000296 | 0.000291 | 0.707413 | 0.756278 | 0.000256 | 0.000243 |
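A rough way to read these numbers in code (an illustrative heuristic, not part of `uvcgan2`): near-zero discriminator losses like `disc_a` ≈ `disc_b` ≈ 0.0003 mean the discriminators win almost every sample, which is a common sign of diverged training. A sketch, assuming the `history.csv` header shown above:

```python
# Heuristic divergence check on the last row of a history.csv file.
import csv

def last_losses(path):
    """Return the last row of history.csv as a dict of floats."""
    with open(path) as f:
        rows = list(csv.DictReader(f))
    return {k: float(v) for k, v in rows[-1].items()}

def discriminator_saturated(losses, eps=1e-2):
    """Flag near-zero discriminator losses (discriminators win trivially)."""
    return losses["disc_a"] < eps and losses["disc_b"] < eps
```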
Hi, I am trying to use UVCGANv2 for grayscale image (8-bit) conversion (both domains are grayscale images). My current approach is to convert 8-bit grayscale images into RGB images and then train the models. However, the final images are all white. I was wondering if you have any recommended approaches for handling this? Thank you!
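For reference, the grayscale-to-RGB conversion described above could be done like this (an illustrative sketch with Pillow, not the `uvcgan2` data pipeline; `gray_images` and `rgb_images` are placeholder directory names):

```python
# Replicate a single 8-bit grayscale channel into three identical RGB channels.
from pathlib import Path

from PIL import Image

def gray_to_rgb(src, dst):
    """Convert an 8-bit grayscale image into a 3-channel RGB image."""
    Image.open(src).convert("L").convert("RGB").save(dst)

# Placeholder directories; adjust to the actual dataset layout.
out_dir = Path("rgb_images")
for path in Path("gray_images").glob("*.png"):
    gray_to_rgb(path, out_dir / path.name)
```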