myfrannie opened this issue 4 years ago
When you mentioned "it didn't work", do you mean (a) the program crashes, or (b) the results look worse?
@junyanz I am wondering whether combining the ResNet-based generator with UNet makes sense, specifically by adding skip connections between the downsampling and upsampling layers.
Sure. It might be worth trying.
@junyanz, you mentioned that "For CycleGAN, Resnet-based generators often work much better than UNet." Is there a specific reason? I trained with UNet and burnt my GPU for 200 epochs only to find no good result.
I hypothesize that the ResNet generator has fewer parameters and fewer downsampling layers, both of which are good for color and style transfer, while the U-Net has many more parameters (hard to learn without paired data) and many downsampling layers. But this is just my speculation.
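To make the parameter-count gap concrete, here is a rough back-of-the-envelope tally of convolution weights for the two default generators (`resnet_9blocks` vs `unet_256` with `ngf=64`). This ignores biases and norm layers and is my own approximation of the architectures, not the repo's code, so treat the numbers as an estimate:

```python
# Rough conv-weight tally (biases and norm layers ignored) for the two
# default CycleGAN generators, assuming ngf=64. An estimate, not an
# exact parameter count from the repo.

def conv_weights(c_in, c_out, k):
    """Weight count of a k x k convolution from c_in to c_out channels."""
    return c_in * c_out * k * k

# resnet_9blocks: 7x7 stem, two stride-2 downsamples, 9 residual blocks
# (two 3x3 convs each), two upsamples, 7x7 output conv
resnet = (conv_weights(3, 64, 7)
          + conv_weights(64, 128, 3) + conv_weights(128, 256, 3)
          + 9 * 2 * conv_weights(256, 256, 3)
          + conv_weights(256, 128, 3) + conv_weights(128, 64, 3)
          + conv_weights(64, 3, 7))

# unet_256: eight stride-2 4x4 downsamples and eight upsamples; the skip
# connections double the input channels of most up-convolutions
down = [(3, 64), (64, 128), (128, 256), (256, 512),
        (512, 512), (512, 512), (512, 512), (512, 512)]
up = [(512, 512), (1024, 512), (1024, 512), (1024, 512),
      (1024, 256), (512, 128), (256, 64), (128, 3)]
unet = sum(conv_weights(i, o, 4) for i, o in down + up)

print(f"resnet_9blocks ~{resnet / 1e6:.1f}M conv weights")
print(f"unet_256       ~{unet / 1e6:.1f}M conv weights")
```

Under these assumptions the U-Net carries roughly 4-5x more weights than the ResNet generator, which is consistent with the "harder to learn without paired data" intuition above.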
I trained with:

```make
## Training model with generator unet_256 and discriminator PatchGAN
train_SPSCDvsALSHD_random_Gunet_256_DPatchGAN:
	nohup $(PYTHON_INTERPRETER) ../train.py --dataroot ../datasets/SPSCDvsALSHD_random \
	    --name SPSCDvsALSHD_random_Gunet_256_DPatchGAN --model cycle_gan --gpu_ids 2 \
	    --netG unet_256 --gan_mode vanilla --pool_size 50 --batch_size 1 \
	    --checkpoints_dir ../checkpoints --display_id -1 \
	    --preprocess scale_width_and_crop --load_size 1920 --crop_size 512 \
	    --save_epoch_freq 20 \
	    > ./checkpoints/SPSCDvsALSHD_random_Gunet_256_DPatchGAN/SPSCDvsALSHD_random_Gunet_256_DPatchGAN.log 2>&1 &
```
And I test with:

```make
## Testing model with generator unet_256 and discriminator PatchGAN
test_SPSCDvsALSHD_random_Gunet_256_DPatchGAN:
	$(PYTHON_INTERPRETER) ../test.py --dataroot ../datasets/SPSCDvsALSHD_random/testB \
	    --name SPSCDvsALSHD_random_Gunet_256_DPatchGAN --model test --no_dropout \
	    --checkpoints_dir ../checkpoints --preprocess none --load_size 1920 --gpu_ids 3
```
I got the following error. Sorry, I'm new to this area and could not figure out what kind of parameters I missed for testing. Could you please let me know?
Traceback (most recent call last):
File "../test.py", line 52, in
Hello, thanks for providing your code. I applied CycleGAN to my own dataset, whose images are 900x900. First, I used the code with the default settings and it worked well (of course, training with
`--preprocess scale_width_and_crop --load_size 900 --crop_size 256`
and testing with `--preprocess scale_width --load_size 900`).
After that, I tried to train and test the dataset with the unet generator and the same options. However, it didn't work, and I think it's because the image size has to be 256x256 for unet. Is there any way to use large images with the unet generator other than cropping the dataset?
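My assumption about why 900x900 fails: `unet_256` halves the spatial size 8 times, so height and width must be divisible by 2**8 = 256, while the default ResNet generator only downsamples twice and only needs divisibility by 4. A quick check of that constraint:

```python
# Divisibility constraint imposed by the generators' downsampling depth
# (my assumption about the failure cause, not output from the repo).

def fits(size, n_downsamples):
    """True if a spatial size survives n stride-2 downsample/upsample pairs."""
    return size % (2 ** n_downsamples) == 0

print(fits(900, 8))   # False -> 900 is not divisible by 256, so unet_256 fails
print(fits(900, 2))   # True  -> resnet_9blocks (2 downsamples) handles 900x900
print(fits(1024, 8))  # True  -> scaling or padding to 1024 is one workaround
```

So scaling or padding the images to a multiple of 256 (e.g. 768 or 1024) would be one way to feed large images to the unet generator without cropping.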