donydchen / ganimation_replicate

An Out-of-the-Box Replication of GANimation using PyTorch; pretrained weights are available!
MIT License

test using other dataset #1

Open · uniBruce opened this issue 5 years ago

uniBruce commented 5 years ago

Thanks very much for this replication; I did not get good results with GANimation, so your work helped a lot. The test code for the pretrained model also uses the CelebA validation set, right? However, when I set the test dataset path to my own test dataset, I ran into many problems in base_dataset.py and data_loader.py. Could you look into that? Different problems kept turning up while I was debugging, so I could not attach all of them.

donydchen commented 5 years ago

Hi @uniBruce, the code uses 'batch' settings for testing, so if you want to test on datasets other than CelebA or EmotionNet, you'll need to create a specific dataset class that inherits from the base class in base_dataset.py. Basically, you can just copy celeba.py and modify a few lines of code to adapt it to your own dataset, then call your dataset class in data_loader.py.

Or you can just structure your own dataset the same way as the CelebA dataset I provide, and name it celeba.
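
For reference, here is a minimal sketch of what such a dataset class might look like. The class and method names below (BaseDataset, initialize, opt.data_root, the dispatch in data_loader.py) are assumptions based on the pattern described above, not the repo's exact interface; check celeba.py and data_loader.py for the real signatures.

```python
# Hypothetical sketch of a custom dataset, modeled on the celeba.py pattern
# described above. Names like BaseDataset and initialize() are assumptions;
# mirror whatever celeba.py actually defines.
import os
from base_dataset import BaseDataset


class MyFaceDataset(BaseDataset):
    """Custom dataset following the same interface as the CelebA class."""

    def initialize(self, opt):
        # Reuse the base-class setup (image list, AU labels, transforms),
        # then point the paths at your own data layout.
        super(MyFaceDataset, self).initialize(opt)
        self.imgs_dir = os.path.join(opt.data_root, "imgs")

    def name(self):
        return "MyFaceDataset"


# Then dispatch on the dataset name in data_loader.py (illustrative):
#     if opt.dataset_name == "myface":
#         dataset = MyFaceDataset()
```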

uniBruce commented 5 years ago

Thanks, I will try it.

uniBruce commented 5 years ago

Hi, it's me again. I noticed that you set both lambda_mask and lambda_tv to 0; could you explain why? These two parameters caused me a lot of trouble when I replicated GANimation: the attention mask always saturates to 1. I also trained GANimation on your CelebA dataset, but the expression never changed.
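
For readers following along: these two weights correspond to the attention-mask regularizers in the GANimation paper, an L1 penalty on the mask plus a total-variation smoothness term. Below is a rough PyTorch sketch of those two terms; the function and variable names are illustrative, not this repo's code.

```python
import torch


def mask_regularizers(attention_mask, lambda_mask=0.0, lambda_tv=0.0):
    """Attention-mask penalties in the style of the GANimation paper.

    attention_mask: tensor of shape (N, 1, H, W) with values in [0, 1].
    Names and weighting are illustrative, not this repo's exact code.
    """
    # L1 sparsity term: pushes the mask toward 0, discouraging it from
    # saturating to 1 everywhere (which would just copy the input image
    # and suppress any expression change).
    loss_mask = lambda_mask * torch.mean(attention_mask)

    # Total-variation term: encourages spatially smooth masks.
    tv_h = torch.mean(torch.abs(attention_mask[:, :, 1:, :] - attention_mask[:, :, :-1, :]))
    tv_w = torch.mean(torch.abs(attention_mask[:, :, :, 1:] - attention_mask[:, :, :, :-1]))
    loss_tv = lambda_tv * (tv_h + tv_w)

    return loss_mask + loss_tv
```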

donydchen commented 5 years ago

Hi @uniBruce, from my point of view, these two parameters won't affect the final results much, as long as they are not set to very large values. Training with the code and dataset provided by this project, you should be able to get similarly reasonable results. However, if you implemented GANimation yourself and the expression never changes at test time, check the results during training first. If the training results look fine, then something may be wrong with the normalization layer: if you use instance_norm, remember to set affine=False, track_running_stats=False. See https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/issues/395 for more details.
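
For concreteness, the setting mentioned above looks like this in PyTorch (the channel count is just an example):

```python
import torch.nn as nn

# Instance norm without learnable affine parameters and without running
# statistics, so the layer behaves identically in train and eval modes.
norm_layer = nn.InstanceNorm2d(64, affine=False, track_running_stats=False)

# By contrast, track_running_stats=True makes model.eval() switch to stored
# statistics, a common cause of "training looks fine, testing produces no
# expression change" in CycleGAN/GANimation-style models.
norm_layer_risky = nn.InstanceNorm2d(64, affine=True, track_running_stats=True)
```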

uniBruce commented 5 years ago

Thanks a lot. I'll check on that.

syrilzhang commented 5 years ago

@uniBruce Hello, did you manage to fix these problems in the original GANimation code rather than in this ganimation_replicate? I run into the same problems with the original implementation from the paper, but this replication does not have them.

uniBruce commented 5 years ago

I tried a few times following what the author of this replication suggested, but got almost no change.