EvelynFan / FaceFormer

[CVPR 2022] FaceFormer: Speech-Driven 3D Facial Animation with Transformers
MIT License

Why batch_size = 1 #32

Open bezorro opened 2 years ago

bezorro commented 2 years ago

Hi, I find that batch_size = 1 (refer to link). Is there any reason for this? I think training may be faster with a larger batch_size.

TonyLRJ commented 2 years ago

Hi, because the lengths of the data samples differ, the default collate() in the dataloader can't work if batch_size != 1.

bezorro commented 2 years ago

> Hi, because the lengths of the data samples differ, the default collate() in the dataloader can't work if batch_size != 1.

Thanks @TonyLRJ. So we can implement a custom collate_fn for the dataloader to support collating data with different lengths? If so, I will try to fix it. Are there any other reasons? E.g., might the model be unable to forward with batch_size != 1?
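For what it's worth, a custom collate_fn along these lines should handle variable-length samples by padding along the time axis. This is only a minimal sketch, assuming each dataset item is an (audio, vertice) pair of tensors shaped [T, C]; the real FaceFormer Dataset returns more fields (template, one-hot, file name), so adapt accordingly:

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def collate_pad(batch):
    """Pad variable-length (audio, vertice) pairs into a batch.

    batch: list of (audio [T_a, C_a], vertice [T_v, C_v]) tensor pairs.
    Returns padded tensors [B, T_max, C] plus the true lengths, which
    the model would need for masking the padded positions.
    """
    audios, verts = zip(*batch)
    audio_lens = torch.tensor([a.shape[0] for a in audios])
    vert_lens = torch.tensor([v.shape[0] for v in verts])
    # pad_sequence zero-pads each sequence up to the batch maximum
    audio_pad = pad_sequence(list(audios), batch_first=True)
    vert_pad = pad_sequence(list(verts), batch_first=True)
    return audio_pad, vert_pad, audio_lens, vert_lens
```

Note that padding alone is not enough: the attention masks and the loss would also have to ignore the padded frames, otherwise the model trains on zeros.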

oliver8459 commented 1 year ago


Hi, have you fixed this problem? Could you please share the solution?

icech commented 1 year ago

I found the main difficulty is how to run linear_interpolation in batch mode when the inputs are padded.
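One way around this is to interpolate each sample only over its valid (unpadded) region and re-pad afterwards, so the zero padding never bleeds into the interpolated values. A hedged sketch, assuming padded features shaped [B, T_max, C] and using F.interpolate the way the repo's linear_interpolation does for a single sample (the per-sample loop is my own workaround, not the repo's code):

```python
import torch
import torch.nn.functional as F
from torch.nn.utils.rnn import pad_sequence

def batched_linear_interpolation(features, in_lengths, out_lengths):
    """Resample each sequence in a padded batch to its own target length.

    features:    [B, T_max, C] zero-padded audio features.
    in_lengths:  true input length of each sample.
    out_lengths: desired output length of each sample (e.g. video frames).
    Returns a [B, T_out_max, C] zero-padded tensor.
    """
    outs = []
    for feat, n_in, n_out in zip(features, in_lengths, out_lengths):
        # slice off the padding, then interpolate over [1, C, n_in]
        valid = feat[: int(n_in)].transpose(0, 1).unsqueeze(0)
        res = F.interpolate(valid, size=int(n_out),
                            mode="linear", align_corners=True)
        outs.append(res.squeeze(0).transpose(0, 1))  # back to [n_out, C]
    return pad_sequence(outs, batch_first=True)
```

The Python loop gives up some speed, but it keeps the interpolation numerically identical to the batch_size = 1 path, which seems to be the main correctness concern here.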