Hi, I ran the following command to run validation on the trained Attn Generator:
python main.py --cfg cfg/eval_bird.yml --gpu 1
but it produced 2928 images instead of the 2933 reported in the paper's "Statistics of datasets" table (dataset.__len__ also returns 2933).
I think the PyTorch DataLoader initialization parameter "drop_last" is the key. However, when I set it to False, the code raised an error:
RuntimeError: Expected hidden[0] size (2, 5, 128), got (2, 16, 128)
indicating that the last, smaller batch failed (2933 mod 16 = 5, which matches the size in the error). The configured batch_size, rather than input.size(0), is used to construct the model's hidden state, so the number of generated images varies with the batch_size setting.
Setting batch_size to 7 (2933 = 7 × 419, so every batch is full) gives me all 2933 images. So why is batch_size set to 16 in the public eval_bird.yml? Is there any setup that would let me use an arbitrary batch_size?
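For reference, here is a minimal sketch of what I suspect is going on and how it could be fixed, assuming the RNN hidden state is sized from the configured batch size rather than from each incoming batch (the class and method names below are hypothetical, not the repo's actual code):

```python
import torch
import torch.nn as nn

class TextEncoder(nn.Module):
    # Hypothetical RNN text encoder; the hidden state's batch dimension
    # must match each incoming batch, including a smaller final batch.
    def __init__(self, vocab_size=100, emb_dim=300, hidden_dim=128, num_layers=2):
        super().__init__()
        self.num_layers = num_layers
        self.hidden_dim = hidden_dim
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden_dim, num_layers, batch_first=True)

    def init_hidden(self, batch_size):
        # Size the hidden state from the actual batch, not cfg batch_size,
        # so the last partial batch with drop_last=False still works.
        h = torch.zeros(self.num_layers, batch_size, self.hidden_dim)
        c = torch.zeros(self.num_layers, batch_size, self.hidden_dim)
        return (h, c)

    def forward(self, captions):
        # captions: (batch, seq_len); captions.size(0) may be smaller
        # than the configured batch size on the final batch.
        hidden = self.init_hidden(captions.size(0))
        out, _ = self.rnn(self.emb(captions), hidden)
        return out

enc = TextEncoder()
full = torch.randint(0, 100, (16, 10))   # a full batch of 16
last = torch.randint(0, 100, (5, 10))    # the final partial batch of 5
print(enc(full).shape)  # torch.Size([16, 10, 128])
print(enc(last).shape)  # torch.Size([5, 10, 128])
```

If instead init_hidden were called with the configured batch size (16), feeding the final batch of 5 would raise exactly the "Expected hidden[0] size (2, 5, 128), got (2, 16, 128)" error above.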