Carlyx opened this issue 2 years ago (status: Open)
Hi @Carlyx, Question 2) If you look carefully, there are two batch_size values in hparams.py (batch_size and syncnet_batch_size). The train_data_loader uses batch_size, which is 16, not 64; that is why the progress bar shows 2865 iterations: ceil(45839 / 16) = 2865.
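To make the arithmetic concrete, a quick check in Python (45839 is the line count quoted in the question; the last partial batch still counts as one iteration):

```python
import math

num_train_files = 45839  # lines in LRS2 train.txt, from the question

# tqdm's "Nit" counts batches, and a DataLoader yields
# ceil(num_samples / batch_size) batches per epoch (last partial batch included).
print(math.ceil(num_train_files / 64))  # 717  -> expected if batch_size were 64
print(math.ceil(num_train_files / 16))  # 2865 -> matches the observed "2865it"
```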
Question 1) You may try lowering the eval_steps variable here, so that evaluation runs over fewer batches and finishes faster.
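A minimal sketch of that idea, assuming an evaluation loop shaped like the one in wav2lip_train.py (the function signature and the single L1 loss here are simplified stand-ins, not the repo's exact code):

```python
import torch
import torch.nn.functional as F

def eval_model(test_data_loader, device, model, eval_steps=50):
    """Run at most eval_steps validation batches instead of the full set."""
    model.eval()
    losses = []
    with torch.no_grad():
        for step, (x, indiv_mels, mel, gt) in enumerate(test_data_loader):
            if step >= eval_steps:  # lowering eval_steps cuts eval time proportionally
                break
            g = model(indiv_mels.to(device), x.to(device))
            losses.append(F.l1_loss(g, gt.to(device)).item())
    return sum(losses) / max(len(losses), 1)
```

The trade-off is coverage: fewer evaluation batches means a noisier estimate of the validation loss, but each evaluation pass takes proportionally less time.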
Hi, your work is amazing!
Q1: I train the Wav2Lip model on LRS2, but during training, eval_model (https://github.com/Rudrabha/Wav2Lip/blob/b9759a3467cb1b7519f1a3b91f5a84cb4bc1ae4a/wav2lip_train.py#L251) takes about 30 minutes (on one V100). Is this normal?
Q2: In LRS2, train.txt has 45839 lines, and the batch_size of train_data_loader is 64, so len(train_data_loader) = ceil(45839 / 64) = 717. But in the loop at https://github.com/Rudrabha/Wav2Lip/blob/b9759a3467cb1b7519f1a3b91f5a84cb4bc1ae4a/wav2lip_train.py#L210, I get well over 717 iterations per epoch: `L1: 0.026415626978400907, Sync Loss: 0.0: : 2865it [2:04:31, 3.36it/s]`. I'm not sure why this is happening; any advice, please?
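For anyone hitting the same mismatch, one quick sanity check is to print the configured batch sizes before training; a hypothetical snippet, assuming Wav2Lip's hparams module is importable from the repo root:

```python
from hparams import hparams  # Wav2Lip's config module

# Two different batch sizes live in hparams.py; the Wav2Lip training loader
# uses hparams.batch_size, so len(train_data_loader) follows this value.
print("hparams.batch_size =", hparams.batch_size)                  # 16 by default
print("hparams.syncnet_batch_size =", hparams.syncnet_batch_size)  # 64 by default
```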