Closed wangherr closed 2 years ago
In the readme, the VAE is trained on 'celeba':
celeba size: 3*256*256, len: 27000, gpus: 16, batch: 4, epochs: 200
If I want to train on another dataset, such as afhq:
afhq size: 3*256*256, len: 5153, gpus: 16, batch: 4, epochs: 200
The lengths of the two datasets are not equal, so it is not suitable to simply increase the epochs, because the LR scheduler depends on the total number of steps.
Could you please give some suggestions?
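One possible approach (a sketch, not the repo's method): keep the total number of optimizer steps the same across datasets, so a step-based LR scheduler sees the same horizon, and solve for the epoch count on the new dataset. The helper name `equivalent_epochs` and the drop-last batching assumption are hypothetical, not from the repo:

```python
# Hypothetical helper: pick an epoch count for a new dataset so the total
# number of optimizer steps matches a reference run. Assumes the effective
# batch is batch * gpus and that partial batches are dropped (drop_last).
import math

def equivalent_epochs(ref_len, ref_epochs, new_len, batch, gpus):
    effective_batch = batch * gpus
    ref_total_steps = (ref_len // effective_batch) * ref_epochs
    new_steps_per_epoch = new_len // effective_batch
    return math.ceil(ref_total_steps / new_steps_per_epoch)

# celeba reference run: len 27000, 16 gpus, batch 4, 200 epochs
# afhq: len 5153 -> epochs needed to cover the same number of steps
print(equivalent_epochs(27000, 200, 5153, batch=4, gpus=16))
```

With these numbers, afhq would need on the order of a thousand epochs to see as many steps as the celeba run, which is why simply reusing epochs=200 gives the scheduler a much shorter horizon.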