deepfakes / faceswap-playground

User dedicated repo for the faceswap project

How to calculate the GPU RAM needed by different batch sizes? Does batch size affect model quality? #274

Closed pengyang closed 5 years ago

pengyang commented 5 years ago

Questions:

  1. How do I calculate the video RAM needed for a given batch size?
  2. How does batch size affect trained model performance? Is a batch size of 64 or 128 better?

Expected behavior

The GPU is a GTX 970 with 4GB of video RAM; I expect faceswap training to run normally with the default batch size of 64. Other info: the training image size is 256*256.

Actual behavior

Running faceswap training with the default batch size of 64 gives an OOM error. Running with `-bs 32` works well.
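For what it's worth, activation memory tends to scale roughly linearly with batch size, which matches what I saw (64 fails in 4GB, 32 fits). A rough sketch of that extrapolation (the function name and the 2 GB figure are assumptions for illustration, not measurements):

```python
def scale_activation_memory(mem_gb, batch_size, target_batch_size):
    """Linearly extrapolate activation memory to a new batch size.

    This is only a rule of thumb: model weights and framework overhead
    are fixed costs, so total VRAM grows slower than linearly, but the
    per-sample activation portion roughly doubles when the batch does.
    """
    return mem_gb * target_batch_size / batch_size

# e.g. if activations take ~2 GB at batch size 32, batch size 64
# would need roughly twice that for activations alone:
print(scale_activation_memory(2.0, 32, 64))  # → 4.0
```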

Steps to reproduce

NA

Other relevant information

bryanlyon commented 5 years ago

There is no way to predict it exactly; it depends on your system, your TensorFlow version, other running apps, and more. You just have to experiment to find the batch sizes that work.
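That experiment can be automated with a simple halving search, the same thing the reporter did by hand (64 failed, 32 worked). A minimal sketch, where `try_train` is a hypothetical callable you would supply that attempts one training step and reports whether it ran without an OOM:

```python
def find_max_batch_size(try_train, start=64, floor=1):
    """Halve the batch size until a training attempt succeeds.

    try_train(bs) is a hypothetical callable: it should attempt one
    training step at batch size bs and return True on success,
    False on an out-of-memory error.
    Returns the first batch size that worked, or None if even
    the floor fails.
    """
    bs = start
    while bs >= floor:
        if try_train(bs):
            return bs
        bs //= 2  # halve on OOM, as the reporter did (64 -> 32)
    return None

# Example with a fake trainer standing in for a real OOM check,
# mimicking a 4GB card where 32 fits but 64 does not:
fits_in_4gb = lambda bs: bs <= 32
print(find_max_batch_size(fits_in_4gb))  # → 32
```

In practice you would wrap the real training step in a try/except that catches the framework's OOM exception and returns False.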