phuang17 opened this issue 6 years ago
I have never tried batch_size > 1 due to GPU memory limits, but it should work. I will make changes so that batch_size > 1 also works.
If you have multiple GPUs, could you please try batch_size > 1 across them? Given the 4-6 day training time, it may shorten training considerably.
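In case it helps anyone trying this, here is a minimal sketch of splitting a batch across multiple GPUs with PyTorch's `nn.DataParallel`. This assumes the model is a standard `nn.Module` (the tiny model below is just a placeholder, not the network from this repo), and falls back to single-device execution when fewer than two GPUs are available:

```python
import torch
import torch.nn as nn

# Hypothetical small model standing in for the actual network.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))

# Wrap with DataParallel only when multiple GPUs are available;
# on a single GPU or CPU the plain module is used unchanged.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model).cuda()
elif torch.cuda.is_available():
    model = model.cuda()

# With batch_size > 1, DataParallel splits the batch along dim 0,
# so each device processes roughly batch_size / num_gpus samples.
batch_size = 4
x = torch.randn(batch_size, 8)
if torch.cuda.is_available():
    x = x.cuda()

out = model(x)
print(out.shape)  # torch.Size([4, 1])
```

Note that `DataParallel` replicates the model on every forward pass; for long multi-day trainings like this one, `torch.nn.parallel.DistributedDataParallel` is usually faster, but it requires launching one process per GPU.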