neuralchen / SimSwap

An arbitrary face-swapping framework on images and videos with one single trained model!

Training with the VGGFace2-224 dataset results in white face. #277

Open PeterLorre opened 2 years ago

PeterLorre commented 2 years ago

Firstly, I wanted to say thank you so very much for sharing your repo. I've had a lot of fun testing out SimSwap. It's great at parties.

I noticed the readme says that you highly recommend training, so I took a crack at it. Unfortunately, my training hasn't produced great results. As in issue #255, I too am seeing a white face for some odd reason. I trained for about 600k iterations with a batch size of 23 on a T100. It took a great deal of time and effort, and I would caution anyone thinking about training, as I am fairly disappointed that I could not get it to work.

  1. Has anyone managed to successfully train and utilize their checkpoints for 224 or 512? If so, could you kindly share those with the community or provide more detailed instructions for getting it to work correctly?

  2. I'm sure there is much anticipation for SimSwap++. Are there plans to release it shortly or should we focus on getting the training to work properly?

  3. Could anyone recommend any other datasets that they have trained on and used with SimSwap?

Thank you.

BeaverInGreenland commented 2 years ago

Hello,

  1. I think you trained your models properly. Check the samples folder, which contains what appear to be the validation results written out for each checkpoint; that should confirm your models are predicting correctly. The problem lies with the test scripts, which are not suited to our own trained models: there seems to be a discrepancy between the train script's hyper-parameters and those of the test scripts. I hope the authors of this awesome project will fix this. In the meantime, you need to write a new test script based on the train script, extracting the code snippet that produces the contents of the samples folder (a rough sketch is given below).
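Something along these lines is what I mean. It is only a sketch, not the authors' script: every SimSwap-specific identifier in it (the `Generator_Adain_Upsample` class and its constructor arguments, the checkpoint and ArcFace paths, the 224x224 preprocessing) is an assumption that you must check against `train.py` and your own training options.

```python
# Hypothetical standalone test script, distilled from the sample-writing code in train.py.
# All SimSwap-specific names (module paths, constructor arguments, checkpoint locations)
# are assumptions -- copy the real ones from your own train script and training options.
import torch
import torch.nn.functional as F
from torchvision import transforms
from PIL import Image

from models.fs_networks import Generator_Adain_Upsample  # assumed module path

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# 1. Rebuild the generator with EXACTLY the options used for training.
netG = Generator_Adain_Upsample(input_nc=3, output_nc=3, latent_size=512,
                                n_blocks=9, deep=False)  # assumed arguments
netG.load_state_dict(torch.load("checkpoints/simswap224_vggface2/latest_net_G.pth",
                                map_location=device))
netG.to(device).eval()

# 2. Load the same ArcFace network the train script uses to extract identity vectors
#    (the loading below is an assumption -- mirror how train.py actually does it).
netArc = torch.load("arcface_model/arcface_checkpoint.tar", map_location=device)
netArc = netArc.to(device).eval()

# 3. Preprocess with the SAME resize/normalization as the training dataloader.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def load_image(path):
    return preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

src = load_image("source_face.jpg")   # identity donor
tgt = load_image("target_face.jpg")   # attribute / pose image

with torch.no_grad():
    # ArcFace expects a 112x112 input; the train loop downsamples before embedding.
    src_small = F.interpolate(src, size=(112, 112), mode="bilinear", align_corners=False)
    latent_id = F.normalize(netArc(src_small), p=2, dim=1)

    swapped = netG(tgt, latent_id)    # the same forward call that fills the samples folder

# 4. Save the raw generator output without the extra (de)normalization the stock test
#    scripts apply, since that is where the suspected mismatch lives.
transforms.ToPILImage()(swapped.squeeze(0).clamp(0, 1).cpu()).save("swap_result.jpg")
```

If the samples folder looks fine but a script like this still gives white faces, the first thing to compare is the preprocessing in step 3 against your training dataloader, since any difference in normalization will show up directly in the output colors.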

Have a nice day!