It was trained for 22,000 steps.
Hello @holynski, do the 22,000 steps refer to the number of optimizer steps or the number of training batches? Since your training config accumulates gradients over every 4 batches, that amounts to a 4x difference. Thank you!
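For anyone comparing the two counts, here is a minimal sketch of the arithmetic, assuming gradient accumulation over 4 batches as described above (the constant is taken from this question, not read from the repo's config):

```python
# Sketch of the step/batch arithmetic under gradient accumulation.
ACCUMULATE_GRAD_BATCHES = 4  # assumed value, per the question above

def optimizer_steps_from_batches(num_batches: int) -> int:
    """Optimizer steps performed after processing `num_batches` batches."""
    return num_batches // ACCUMULATE_GRAD_BATCHES

def batches_from_optimizer_steps(num_steps: int) -> int:
    """Forward/backward passes needed to reach `num_steps` optimizer steps."""
    return num_steps * ACCUMULATE_GRAD_BATCHES

print(optimizer_steps_from_batches(22_000))  # 5500 optimizer steps if 22,000 counts batches
print(batches_from_optimizer_steps(22_000))  # 88000 batches if 22,000 counts optimizer steps
```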
Hi. I am also confused about the training steps. The paper says 10,000 steps with a batch size of 1024, but the default setting is a batch size of 32 × 8 = 256. Also, the epoch number is set to 2000, which seems far too large.
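To make the numbers concrete, here is a rough sketch of how the batch-size figures could reconcile. The accumulation factor and dataset size below are placeholders/assumptions, not values confirmed by the paper or the repo:

```python
# Rough sketch of the batch-size and epoch arithmetic (placeholder values).
PER_GPU_BATCH_SIZE = 32        # default config value cited above
NUM_GPUS = 8
ACCUMULATE_GRAD_BATCHES = 4    # hypothetical: accumulation would close the gap to 1024

effective_batch = PER_GPU_BATCH_SIZE * NUM_GPUS                         # 256
effective_batch_with_accum = effective_batch * ACCUMULATE_GRAD_BATCHES  # 1024, matching the paper

# Epochs implied by a given number of optimizer steps, for a hypothetical dataset size.
DATASET_SIZE = 450_000  # placeholder, not the actual dataset size
steps = 10_000
epochs = steps * effective_batch_with_accum / DATASET_SIZE

# Under these assumptions only a few dozen epochs are reached, far fewer than 2000,
# so a large max-epoch setting would never be the binding limit.
print(effective_batch, effective_batch_with_accum, round(epochs, 1))
```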
The model name instruct-pix2pix-00-22000.ckpt looks like it was trained for 22 epochs, but when I trained the model myself it ran for more than that. I just want to confirm how many epochs we should train to reproduce the final results. Thanks.
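One possible reading of the filename, assuming a Lightning-style `<name>-<epoch>-<step>.ckpt` template (this is a guess about the naming convention, not something the authors have confirmed): `00` would be the zero-based epoch counter and `22000` the global step, so the name would not imply 22 epochs. A small sketch:

```python
# Hedged guess: parse the checkpoint name as "<name>-<epoch>-<step>.ckpt".
# This is an assumption about the naming convention, not a statement from the authors.
import re

def parse_ckpt_name(name: str) -> tuple[int, int]:
    match = re.match(r".*-(\d+)-(\d+)\.ckpt$", name)
    if match is None:
        raise ValueError(f"unrecognized checkpoint name: {name}")
    epoch, step = (int(g) for g in match.groups())
    return epoch, step

print(parse_ckpt_name("instruct-pix2pix-00-22000.ckpt"))  # (0, 22000)
```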
Hi,
I need to download the dataset from mainland China, but the download is very slow: even with a network proxy the speed does not exceed 100 KB/s, and I need the dataset as soon as possible.
If you have already downloaded the dataset, could you send me a copy?