jerryli27 / TwinGAN

Twin-GAN -- Unpaired Cross-Domain Image Translation with Weight-Sharing GANs
Apache License 2.0

The results of my own training, tested with image_translation_infer.py, look very bad. #16

Open c1a1o1 opened 5 years ago

c1a1o1 commented 5 years ago

The results of my own training, tested with image_translation_infer.py, look very bad.

jerryli27 commented 5 years ago

It'd be helpful if you could share the dataset you used, the training script, and the results here. Thanks!

c1a1o1 commented 5 years ago

@jerryli27
For CelebA-to-cat, my training parameters were:

--program_name=twingan
--dataset_name="ren"
--dataset_dir="datasets/ren2cat/ren/tfrecord/"
--unpaired_target_dataset_name="cat"
--unpaired_target_dataset_dir="datasets/ren2cat/cat/tfrecord/"
--train_dir="./checkpoints/rencat/"
--dataset_split_name=train
--preprocessing_name="danbooru"
--resize_mode=RESHAPE
--do_random_cropping=True
--learning_rate=0.0001
--learning_rate_decay_type=fixed
--is_training=True
--generator_network="pggan"
--use_unet=True
--num_images_per_resolution=300000
--loss_architecture=dragan
--gradient_penalty_lambda=0.25
--pggan_max_num_channels=256
--generator_norm_type=batch_renorm
--hw_to_batch_size="{4: 8, 8: 8, 16: 8, 32: 8, 64: 8, 128: 4, 256: 3, 512: 2}"
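As a side note, --hw_to_batch_size takes a Python dict literal mapping image resolution to batch size, so larger resolutions train with smaller batches. Below is a minimal sketch of how such a string-valued flag can be parsed, assuming ast.literal_eval-style handling; TwinGAN's actual flag parsing may differ:

```python
# Illustrative only: parsing a dict-valued CLI flag like --hw_to_batch_size.
# TwinGAN's real flag handling may differ; this just shows the idea.
import ast

def parse_hw_to_batch_size(flag_value):
    """Turn the flag string into a {resolution: batch_size} dict."""
    mapping = ast.literal_eval(flag_value)  # safely evaluates the dict literal
    if not all(isinstance(k, int) and isinstance(v, int) for k, v in mapping.items()):
        raise ValueError("expected int keys and values: %r" % mapping)
    return mapping

hw_to_batch_size = parse_hw_to_batch_size(
    "{4: 8, 8: 8, 16: 8, 32: 8, 64: 8, 128: 4, 256: 3, 512: 2}")
print(hw_to_batch_size[256])  # -> 3, i.e. batch size 3 at 256x256
```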

My test parameters were:

--model_path="../checkpoint/rencat/256/"
--image_hw=256
--input_tensor_name="sources_ph"
--output_tensor_name="custom_generated_t_style_source:0"
--input_image_path="../demo/face/var256/"
--output_image_path="../demo/face/cat256/"
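For reference, my rough understanding of what image_translation_infer.py does with these tensor names, sketched under the assumption of a TF1 SavedModel export; the script may instead restore a raw checkpoint, and the file names below are made up:

```python
# Rough sketch of the inference flow, assuming image_translation_infer.py
# loads a TF1 SavedModel and runs the graph by tensor name. The actual
# script may restore a raw checkpoint instead; file names here are made up.
import numpy as np
import tensorflow as tf  # TwinGAN targets TensorFlow 1.x
from PIL import Image

MODEL_DIR = "../checkpoint/rencat/256/"  # --model_path
IMAGE_HW = 256                           # --image_hw

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], MODEL_DIR)
    # --input_tensor_name gives the placeholder name; ":0" selects its output.
    inputs = sess.graph.get_tensor_by_name("sources_ph:0")
    outputs = sess.graph.get_tensor_by_name("custom_generated_t_style_source:0")

    img = Image.open("../demo/face/var256/example.jpg").convert("RGB")
    img = img.resize((IMAGE_HW, IMAGE_HW))
    # NHWC batch of one; the [0, 1] value range is an assumption.
    batch = np.asarray(img, dtype=np.float32)[None] / 255.0
    result = sess.run(outputs, feed_dict={inputs: batch})
    out = (np.clip(result[0], 0.0, 1.0) * 255).astype(np.uint8)
    Image.fromarray(out).save("../demo/face/cat256/example.jpg")
```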

Is there a problem with these settings?

Thank you very much!

jerryli27 commented 5 years ago

Please try adding --do_pixel_norm=True to the training script; that stabilizes things a bit. Another tip: the image output during training should already look good at 32x32 or 64x64. If it doesn't, there's no need to train all the way up to 256; that would be a waste of time.
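If checking in TensorBoard is inconvenient, here is a rough sketch for dumping the image summaries straight out of the event files under --train_dir, so you can eyeball the low-resolution outputs directly. It assumes standard tf.summary.image summaries are written; adjust the glob pattern, and expect your tag names to differ:

```python
# Hedged sketch: dump image summaries from TensorBoard event files so the
# low-resolution training outputs can be inspected directly. Assumes the
# standard tf.summary.image format; adjust the glob to your --train_dir.
import glob
import tensorflow as tf  # TF 1.x

for event_file in glob.glob("./checkpoints/rencat/events.out.tfevents.*"):
    for event in tf.train.summary_iterator(event_file):
        for value in event.summary.value:
            if value.HasField("image"):  # skip scalar/histogram summaries
                fname = "step%d_%s.png" % (event.step, value.tag.replace("/", "_"))
                with open(fname, "wb") as f:
                    # encoded_image_string is already PNG-encoded bytes.
                    f.write(value.image.encoded_image_string)
```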

And if you want, sharing the images and the TensorBoard output would help a lot with debugging. Let me know how it goes.

c1a1o1 commented 5 years ago

Thank you very much!