Closed jwc666 closed 1 year ago
Hi @jwc666, 1.) That depends on the amount of training data. Our MS1M dataset contains 5822653 images; with a minibatch size of 256, that works out to 22745 batches per epoch. In practice, we used 22722 batches per epoch, because some identities have fewer than 2 images and our algorithm could not find enough unused images for the last batches. 2.) We use a random distribution of low resolutions (7px, 14px, and 28px) within each batch.
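As a sanity check on the numbers above, here is a minimal Python sketch of the batch arithmetic and the per-image random resolution assignment. The function name `assign_resolutions` is hypothetical (not from the repo); only the dataset size, batch size, and resolution list come from the answer.

```python
import math
import random

# Numbers taken from the answer above
NUM_IMAGES = 5822653
BATCH_SIZE = 256
RESOLUTIONS = [7, 14, 28]  # low-resolution variants in pixels

# Ceiling division: 5822653 / 256 = 22744.7..., so 22745 batches per epoch
batches_per_epoch = math.ceil(NUM_IMAGES / BATCH_SIZE)
print(batches_per_epoch)  # 22745

def assign_resolutions(batch_size, resolutions, rng=random):
    """Pick a random low resolution independently for each image in a batch
    (hypothetical helper illustrating the 'random within each batch' scheme)."""
    return [rng.choice(resolutions) for _ in range(batch_size)]

batch_res = assign_resolutions(BATCH_SIZE, RESOLUTIONS)
```

Note that the actual pipeline uses slightly fewer batches (22722) because of the unused-image constraint described above; this sketch only covers the ideal count.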
Dear @Martlgap, first of all, thank you for this magnificent work. I wanted to ask you two questions. 1. How many batches are there in one epoch? (I don't know how you generate minibatches; I randomly selected pictures of different identities as a minibatch, so I need to know this.) 2. Is the resolution of the low-resolution pictures within one batch random, or is it the same within a batch but different across batches? Thank you in advance for your reply.